Reality Conditions

Monday, November 26, 2007

Searching for a Thesis Quote

My PhD thesis is almost finished and hopefully I will submit it within a few weeks. The most important thing lacking, at the moment, is a suitable quote for the beginning. I hereby enlist the help of my readers for suggestions!

I guess I should say something about the topic of the thesis and what kind of quote I am looking for. The topic of the thesis is particle detectors in quantum field theory. I have given a nontechnical explanation in this old post, but if you don't care to go and read it, suffice it to say that it is about the possibility of defining the "particle content" of quantum fields operationally, by the energy transitions an interaction with the quantum field can produce on another quantum system such as an atom. If an atom interacting with a field gets excited, you can say that it has absorbed a field quantum or "particle". This is important because in a curved spacetime context there are usually no other "intrinsic" definitions of particles available. My work concerns, more precisely, the question of giving a rigorous definition of the transition rate of such a detector, which is not as simple as it sounds. You can read the details in my last paper.
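
For readers who like formulas: the standard first-order expression for such a detector response, which is essentially the starting point of the rigorous treatment, says that the probability of exciting the detector through an energy gap omega is proportional to

\[ F(\omega) \;=\; \int d\tau \int d\tau' \, \chi(\tau)\,\chi(\tau')\, e^{-i\omega(\tau-\tau')}\, W\big(x(\tau),x(\tau')\big), \]

where \(\chi\) is the switching function that turns the interaction on and off, \(x(\tau)\) is the detector's worldline and \(W\) is the Wightman two-point function of the field. The "transition rate" is, roughly speaking, the derivative of this quantity with respect to the total detection time, and making that derivative well defined is precisely where the subtleties lie. (This is the textbook formula in my own notation, quoted here only to orient the reader; see the paper for the careful version.)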

For the beginning of the thesis, I do not want a prosaic quote from a physicist about these matters. My ideal would be a poetic, literary or philosophical quote that could, with an effort, be read as alluding to this topic (even though this was obviously not intended). As an example of the kind of thing I like: my undergraduate thesis concerned the calculation of the vacuum energy of quantum fields in a class of spacetimes. As the reality of quantum field vacuum energy means that there is no real vacuum in Nature, that anything that looks empty really has a "zero-point energy", I used a quotation from Parmenides, the Pre-Socratic philosopher who based his philosophy on denying the reality of Not-Being:

That things which are not are, shall never prevail, she said,
but do thou restrain thy mind from this course of investigation.
And let not long-practised habit compel thee along this path,
thine eye careless, thine ear and thy tongue overpowered by noise;
but do thou weigh the much contested refutation of their words,
which I have uttered.

The Spanish translation I used is much more poetical, for those that can read it:

Pues nunca dominará esto: que haya no ser. Aleja tú
el pensamiento de este camino de investigación,
y que la inveterada costumbre no te obligue, a lo largo
de este camino, a utilizar el ojo que no ve, el oído que
resuena, y la lengua; juzga con la razón la combativa
refutación que te he enunciado.

Besides the rejection of Not-Being, the quote was appropriate for talking about "investigación", which in Spanish means "research" as well as "investigation", and is used every day in scientific contexts. Also, the rejection of the senses in favour of reason can be seen (with a slant) as endorsing theoretical physics over experimental physics.

So this is the kind of thing I would like. At the moment, my best candidate is the following quote from Bertrand Russell's An Outline of Philosophy:

'Matter' is a convenient formula for describing what happens where it isn't. I am talking physics, not metaphysics; when we come to metaphysics, we may be able, tentatively, to add something to this statement, but science alone can hardly add to it.

Reasons why this quote is appropriate are that (although he was not exactly talking about the same thing) Russell seems to be endorsing the operational definition of particles my thesis is about; that the ending of the quote looks ironic as a preface to a hundred pages "adding to it" from a scientific point of view; and that An Outline of Philosophy is a very dear book to me, being the first real philosophy book I read. Rereading it recently I found it full of things I could not accept, either scientifically outdated or philosophically unsound; but its general spirit of approaching philosophy in a way closely related to and interwoven with science is one that I still admire. A reason counting against this quote is that it is a bit too prosaic; I would like something more dramatic and unexpected. Russell is so well known in the scientific community that quoting him is only slightly less predictable than quoting Einstein or Feynman. But still, for the moment this is the best I have found.

Any suggestions...?


Wednesday, October 31, 2007

Yet more Shameless Self-Promotion

Transition rate of the Unruh-DeWitt detector in curved spacetime


Jorma Louko, Alejandro Satz

Abstract: We examine the Unruh-DeWitt particle detector coupled to a scalar field in an arbitrary Hadamard state in four-dimensional curved spacetime. Using smooth switching functions to turn on and off the interaction, we obtain a regulator-free integral formula for the total excitation probability, and we show that an instantaneous transition rate can be recovered in a suitable limit. Previous results in Minkowski space are recovered as a special case. As applications, we consider an inertial detector in the Rindler vacuum and a detector at rest in a static Newtonian gravitational field. Gravitational corrections to decay rates in atomic physics laboratory experiments on the surface of the Earth are estimated to be suppressed by 42 orders of magnitude.


Saturday, October 06, 2007

Spinny Foamy talk

Last Tuesday, Jonathan Engle gave a talk for the International Loop Quantum Gravity Seminar. The talk was on the new spin foam models that have been proposed this year; namely, this one by Engle, Pereira and Rovelli and this one by Freidel and Krasnov. We have been having some discussions on both papers here at the Nottingham group as well. The two things that seem most important to me are the possibility of getting a better understanding of the role of the Immirzi parameter, which for the first time appears in spin foam models, and the possibility of checking which is the correct dynamics via semiclassical calculations.

The audio and the slides of Engle's talk are available at the above link. If you are interested in getting an idea of the current state of research in the LQG/spin foam community, I encourage you to listen to the full audio. About half of it, towards the end, is an hour-long discussion between (mostly) Rovelli and Freidel. It gave me a strong feeling of how little is really known, and how basic the disagreements are that are still possible between people following essentially the same research program. It is something one does not see so directly in papers or even in most conference talks.


Friday, September 21, 2007

Quantum Gravity Colloquium: the discussion. (QM vs. QG: the Grudge Match!)

This is a continuation of the previous post. The discussion session on “Are the foundations of Quantum Mechanics relevant for Quantum Gravity”, informally chaired by Anna Gustavsson, was the ending of the colloquium and one of its high points.

The two main opposing positions (basically "No" and "Yes") were championed by Frank Hellmann and Jamie Vicary respectively. I took what started as a compromise position and was later driven to stand on Frank's side when Jamie became more and more iconoclastic. Anna sided with Jamie, but I think few of the rest did, though I can't really remember any of the other specific opinions voiced.

Frank outlined the "conservative" position: Quantum mechanics works extremely well and has convincingly passed experimental tests over a very large range of scales. Granted, we don't fully understand its ontological consequences, and thinking about them is a legitimate issue; but there is no reason not to be confident in applying the formalism to the problem of quantum gravity. There are specific technical problems that arise in this application of quantum principles that do not arise in others (e.g. the problem of time and the "partial observables" formalism Frank himself has worked on), but in his opinion these problems can be examined "orthogonally", so to speak, to the philosophical/interpretational problems.

I am much less convinced that both kinds of problems can always be neatly separated. At first glance, making sense of the quantum formalism in a "timeless" context would seem to require re-examining the notions of measurement and "collapse" in the usual formalism, which are strongly time-asymmetric. The opinion I voiced in the discussion was that perhaps you don't need to solve the problem of the foundations of QM to do QG, but at least you will have to worry about it. In the context of normal applications of QM, we have proof that the various "interpretations" (naïve Copenhagen, Everett, Bohmian…) are not distinguishable in practice and only differ philosophically; for example, decoherence ensures that even if the wavefunction never really "collapses", it will appear as if it had collapsed after a measurement. In the context of QG, we lack a proof that all interpretations give the same answer to questions, or that questions about observables can be framed in a way that is neutral between interpretations. This is simply because we do not know what QG is! So when building up your theory of quantum gravity, you should think about which interpretation you are endorsing, and what "states", "measurements" and other thorny terms mean in your theory.

Frank sort of agreed with me on this, but said that his paper with Rovelli, Perez and Mondragon proved that the meaning of these terms in generally covariant quantum theory is as "interpretation-independent", from a practical point of view, as in ordinary quantum theory. So the worry I raised was legitimate, but has been addressed and solved already. I am unconvinced. The paper considers a particular way of defining the quantum observables in the background-independent case. It applies to quantum theories which are canonical quantizations of a classical theory and where the quantities parametrizing the classical configuration space are promoted to kinematical quantum observables. The problem of what happens with the collapse for several measurements "at different times" is considered, and the answer given (as far as I understand) is that one needs to consider the measuring apparatuses of the earlier measurements as quantum as well, and only apply the formalism to calculate the probability for an outcome in the "final" measurement, where the system and the apparatuses that measured it previously are observed simultaneously. This is clever, but I don't think it solves the problem as conclusively as Frank implied. Can we be certain that the correct theory of quantum gravity will have the form described above? Can't the correspondence between classical and quantum observables be more complicated and subtle than in this kind of theory? Just to pick an example, does the formalism outlined apply to a theory of quantum gravity that is defined, as string theorists claim can be done, by a holographic correspondence between the spacetime and a very different theory on its boundary? How are the observables of such a formalism to be understood? I think it is quite rash to claim, at the present level of ignorance, that we know for certain that quantum gravity can be developed without worrying any more about the philosophical interpretation.

Anyway, this disagreement with Frank played a very short part in the discussion, compared to the more basic argument between both of us and Jamie. Apparently relishing the role of contrarian, Jamie made some strong "radical" claims against the "conservative" position. He said that not only do we not understand the meaning of our present quantum theory; its basic assumptions also go largely unquestioned and are moreover very likely to be false in the realm of quantum gravity; therefore we ought not to try to keep its basic structure, but get rid of it and devise new kinds of theories to attack the problem of QG. When asked what the "unquestioned basic assumptions" were, he mentioned a smooth, continuum spacetime, the use of complex numbers, the basic concept of "probability" as a positive real number, and perhaps some others I have forgotten. The rest of us pointed out that the quantum formalism by itself does not require a smooth spacetime and that in many QG approaches spacetime is discrete while the standard rules of QM are preserved; but Jamie seemed to think there is some inconsistency in this. For example, he said that probabilities would not be real numbers unless there was a real physical continuum –but as Frank said, discrete systems such as a qubit can be in an arbitrary superposition state parametrized by real numbers, and besides, even if the actual probabilities existing in nature were all combinatorial and measured by rational numbers, that would hardly be a fundamental difference from the standard situation meriting a whole different formalism. Jamie (and I think someone else as well, but I can't remember who) also talked about the standard idea of probability as presupposing the possibility of repeating the exact same experiments, which is not possible if spacetime is dynamical. It was answered that, by that reasoning, quantum electrodynamics would require the assumption of a constant background electromagnetic field, which is not true; and that spacetime being dynamical does not preclude the possibility of setting local regions of spacetime in a desired local state –e.g., to do repeatable graviton scattering experiments, if it were technically possible (as black hole production is in large extra dimension models). Another possible answer, which didn't come to my mind during the discussion, is that probability does not need to be defined by frequencies in repeated experiments; it can be defined by degree of belief as in Bayesianism, or by physical "propensities" inherent in particular systems. (A philosopher who specialized in these subjects once told me that the frequentist interpretation of probability is completely discredited and nobody endorses it anymore. This may be an exaggeration, but physicists tend to take it too much for granted that probabilities just "mean" frequencies in repeated experiments.)

Anna sided with Jamie, arguing that QM has worked well for the other three forces but not for gravity, which would indicate that something different is needed in this case. Jamie was firm on the idea that different mathematical frameworks need to be explored, and are more likely to work than the old quantum one. Frank and I answered that, on the contrary, the most reasonable way to do research in the absence of experimental information is to pay maximum respect to ideas that have worked exceptionally well in tested areas, and that Jamie's method would lead to completely arbitrary new theories, most likely with no relation to reality. Historically, new paradigms are developed to replace old ones when empirical evidence makes it clear that the old ones will not work and gives pointers to new possibilities; trying to develop them in vacuo, in a purely philosophical way, does not lead to anything. (Einstein's development of GR is perhaps an exception, but it is a unique case. The 200 years of unsuccessful efforts to replace Newton's action at a distance by some kind of mechanical model of gravity are a more apt comparison to what Jamie was proposing, at least in my opinion.)

I think it was Anna who suggested at some point that “quantizing general relativity” was perhaps the wrong track for research; perhaps what we would call “quantum gravity”, in the sense of being the microscopic structure of spacetime, has nothing to do with a quantization of Einstein’s GR. I agreed on this but I thought that this is independent of whether the QM formalism needs modification; string theory would seem to satisfy Anna’s description while being “orthodox” with respect to quantum foundations, as far as I understand it.

This is all I remember of the discussion. It is very likely that my memories are partial, incomplete and downright inaccurate. I beg the participants to join in the comments to give their own account, correct my misrepresentations, add things I forgot, and of course… continue the discussion! People who were not present are also invited to do this last thing, of course.


Monday, September 17, 2007

Quantum Gravity Colloquium: the talks

Last Friday I went to London for a quantum gravity student colloquium at Imperial College. The idea was to meet up with postgraduates working on quantum gravity in the UK and other European countries, and have some of them give talks explaining their research, with ample time for questions and discussion, in a very informal setting. Keeping it student-only makes it easier to dare to ask potentially stupid questions and voice one's opinions. We had already had one such meeting in Cambridge several months ago, and this second experience was as good as or better than the first.

This time we had four seminar-like talks, plus an open discussion session on the topic "Are the foundational problems of quantum mechanics relevant for quantum gravity?" Starting with the talks: Leron Borsten talked on the entropy of black holes in supergravity theories, and the intriguing connections it has with entanglement in quantum information theory. There is no clear picture yet in which to view the black hole entropy as entanglement entropy, but there are some identities or isomorphisms between the mathematical descriptions of the two concepts that do not look like a coincidence. Yousef Ghazi-Tabatabai talked on an approach to interpreting quantum mechanics which seemed related to the ideas pushed forward by Rafael Sorkin in his plenary talk at Morelia.

Frank Hellmann (who also must be given the lion's share of the credit for organizing the colloquium) talked on "Partial Observables", an approach to defining quantum observables in generally covariant theories and, potentially, solving the problem of time. It has long been championed by Carlo Rovelli, with whom Frank worked before coming to Nottingham. The talk explained how observables that are evolving in a conventional framework can be recast as Dirac observables when the dynamics is written in a generally covariant way. These are fit to answer the question "If the system is in physical state Rho, what is the probability of seeing the correlation (x,t)?" (where x,t are the variables of the classical configuration space). The answer to this question is Tr (Rho P(x,t)), where P(x,t) is the operator that projects onto the physical state which is itself the projection, onto the physical Hilbert space, of the kinematical state corresponding to the correlation (x,t). Sadly, little time was left by the end of the talk for Frank to discuss the thorny case of multi-time measurements, which is the real centrepiece of his paper.
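
(In symbols, and glossing over normalization, the formula was something like

\[ P(x,t\,|\,\rho) \;=\; \mathrm{Tr}\big(\rho\, P_{x,t}\big), \qquad P_{x,t} \;=\; |\Psi_{x,t}\rangle\langle\Psi_{x,t}|, \]

where \(|\Psi_{x,t}\rangle\) is the physical state obtained by projecting the kinematical state \(|x,t\rangle\) onto the physical Hilbert space. I am reconstructing the notation from memory, so take it as a sketch rather than as a faithful copy of the slides.)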

Eugenio Bianchi gave an excellent talk on Perturbative Regge Calculus and Loop Quantum Gravity. It was a version of the talk he gave at Morelia, but with the math replaced by the concepts, which was much better! I think this stuff is extremely important and I hope to start working on it in the future, so I will summarise the talk in more detail than the previous ones. I would be extremely happy to receive comments discussing it or pointing out mistakes in my exposition.

Loop Quantum Gravity is an essentially non-perturbative theory. Any attempt to find a "semiclassical limit" and connect it to established physics is complicated by the fact that semiclassical physics is essentially perturbative; so there is the problem of even mathematically connecting the two frameworks, before a concrete calculation to see if they agree can be done. One way of making this connection is the boundary amplitude formalism introduced by Rovelli. Take a semiclassical kinematical state: a superposition of spin networks, Gaussian-peaked on a classical spacetime (and choose this to be flat space). Fix this as the state on the boundary of a region, and you can compute correlations of observables, measured on the boundary, due to the dynamics in the interior. Use a spin foam model to specify the dynamics: for example, the Barrett-Crane model. You can then calculate, from a nonperturbative theory, semiclassical correlations of your dynamical variables, which are spins jmn. You obtain results. However, you don't know if your initial theory that defines the boundary state (LQG) is correct, nor if the spin foam model you have chosen to encode the dynamics is correct; and besides, as the whole conceptual and calculational framework looks very different from what is used in other areas of physics, you would really, really want something to compare your results with as a check.
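
(Schematically, and simplifying a lot, the correlations in question have the form

\[ \langle O\, O' \rangle \;\approx\; \frac{\langle W \,|\, O\, O' \,|\, \Psi_q \rangle}{\langle W \,|\, \Psi_q \rangle}, \]

where \(\Psi_q\) is the semiclassical boundary state peaked on a flat geometry q, W is the boundary amplitude defined by the chosen spin foam model, and the O's are operators acting on the boundary spins. Don't take my notation too literally; the point is just that the dynamics enters only through W.)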

Enter Perturbative Regge Calculus. Regge Calculus is an approximation scheme for GR in which the curved manifold is replaced by a skeleton triangulation, with the geometry encoded in the discrete edges and vertices. As Eugenio stressed, it can also be thought of as exact (not approximate) GR, but with piecewise flat metrics instead of continuous metrics. Choose a triangulation that discretizes flat space; it is described by the connectivity C and a set of edge lengths Li. Now add small perturbations to the edge length variables, and quantize these perturbations. You are now doing Quantum Perturbative Regge Calculus, which is a straightforward background-dependent, perturbative quantum theory, in which all the standard rules of the game apply. Change your variables from edge lengths to face areas, and calculate the quantum area-area correlations on the boundary of a region. Compare them with the fluctuations in the spins jmn calculated from the non-perturbative theory. Are they equal?
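
For those who have not seen it, the classical Regge action from which the perturbative expansion starts is simple enough to quote (in its textbook form; conventions may vary):

\[ S_{\mathrm{Regge}} \;=\; \frac{1}{8\pi G} \sum_{\text{hinges } h} A_h\, \varepsilon_h \,, \]

where the sum runs over the triangles ("hinges") of the triangulation, \(A_h\) is the area of each hinge and \(\varepsilon_h\) its deficit angle, which measures the curvature concentrated there. Expanding the edge lengths (or areas) around their flat-space values and keeping quadratic terms gives the perturbative quantum theory described above.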

The answer is that, for the Barrett-Crane model dynamics, all quantities compared between both calculations up to the 3-point function match exactly, provided that one identifies the spins used as variables in LQG with the areas used in Regge calculus, up to the factor 8 Pi G b, with b being the Immirzi parameter! This is an independent and nontrivial check of the famous LQG area spectrum, which was derived at a purely kinematical level. Here the dynamics is necessary to ensure the right correspondence between areas and spin variables. For example, if one uses a non-graph-changing Hamiltonian to define the dynamics, the correspondence is not recovered. The calculation with the new spin foam model recently proposed by Rovelli, Pereira and Engle, which cures certain problems with the Barrett-Crane model, is not yet completed and its results are eagerly awaited.
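
(For reference, the kinematical LQG area spectrum alluded to here is, in the usual conventions,

\[ A \;=\; 8\pi \gamma\, \ell_P^2 \sum_i \sqrt{j_i(j_i+1)} \,, \]

with \(\gamma\) the Immirzi parameter, \(\ell_P\) the Planck length, and the sum running over the spins of the links puncturing the surface. It is this proportionality between spins and areas that the dynamical calculation reproduces.)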

It should be stressed that, though nontrivial, this check does not yet amount to confirming that LQG has the "correct" semiclassical limit. The only perturbative theory of quantum gravity that has a right to be acknowledged as "correct" (because it uses uncontroversial quantum field theory principles in an unobjectionable way) is the Effective Field Theory approach popularized by Donoghue. It is not known whether quantum perturbative Regge calculus is a sort of discrete equivalent of this; understanding the connection between them would be a key step forward. But the calculation Eugenio talked about has great importance by itself, because it shows that a fully nonperturbative approach to quantum gravity, when used in conjunction with a semiclassical state, can give the same answers as a better understood perturbative approach.

As this post is getting longish, I will break it here and post in one or two days about the discussion on the relation between foundations of QM and QG. Stay tuned.


Sunday, July 29, 2007

Loops 07: Conference report (part 3, including discussion session)

Saturday 30/06

There were only two plenary talks this morning, followed by a discussion session. The first was by John Stachel, who is a specialist in the philosophy and history of physics (with special reference to Einstein and relativity). He introduced a general philosophy called "measurability analysis", which is based on analyzing and defining possible measuring processes and abstracting from them the quantities that need to be quantized (transformed into non-commuting operators). His analysis of GR suggests, to him at least, that the projective and the conformal structures of spacetime geometry are "what needs to be quantized" in quantum gravity. The second one was by Michael Reisenberger, who sketched with admirable clarity a canonical formalism for GR in which the initial data are given on two intersecting null hypersurfaces. The plus is that null initial data are free, not subject to constraints. He provided a definition of the Poisson bracket in this formalism and suggested that quantization leads to area discretization, though this is not yet on solid ground.

In the discussion session, Carlo Rovelli followed the same procedure used in the Zakopane lectures and read selected questions from a notebook that had circulated among the audience during the previous days for people to write questions in. Obviously there were dozens of questions written, and Rovelli, given the time constraints, had to select only a handful of them for people to discuss. These are the ones that made the cut:

1) Do we expect topology change in Quantum Gravity?

Oriti said that he expects a general framework to allow for topology change, but probably the classical limit can only be recovered on a sector that disallows it. Ashtekar was of the opinion that the canonical LQG framework must allow macroscopic topology change (if lots of spins become trivial, we have macroscopically a "branching" spacetime).

2) What is the relation of Quantum Gravity to the foundational questions in Quantum Mechanics?

Obviously, a question that provokes a lot of discussion. Thiemann and Rovelli are conservatives who think that QG and the foundations of QM can be treated separately -for Rovelli's reasons see what he said in Zakopane. Bianca Dittrich thinks that we need to develop our understanding of relational observers. Lucien Hardy said that GR is at least as radical as QM, so it is unlikely that it can be treated with the standard QM framework. John Donoghue disagreed: according to him, effective field theory shows that GR breaks down at high energies, so the sensible thing is to modify it and keep QM. (As I said in a previous post, this overlooks the fact that in this context what is meant by GR is not the exact Einstein theory, but the conceptual fact that spacetime is dynamical and not fixed.)

3) Could there be experimental consequences of fluctuating causal structure?

Sabine Hossenfelder mentioned the possible consequences for the arrival times of photons, but stressed that this comes only from a phenomenological model with no relation to an underlying theory. Hardy said that a "fluctuation", as a superposition between two classical states, would need some kind of interference experiment to observe, which would be very difficult to realize in practice. I think Ashtekar got into a discussion with him here, but I couldn't follow it well enough to take notes -does anybody remember? Martin Reuter said that causal structure may be different for different observables used to probe it, depending especially on the scale of these observables.

4) What is finite in spin foam models?

Alejandro Perez gave a rather technical answer, of which the only notes I managed to take say: "some models (in 4D) are finite, some are not". Whoa, that's informative. Sorry.

5) Do we expect the fundamental theory to be combinatorial, or to be embedded in a pre-existing manifold?

Rovelli pointed out the conflict between Thiemann's new "Algebraic Quantum Gravity" approach, which is purely combinatorial, and Smolin's program to recover matter from graph braiding, which requires graphs to be embedded. Thiemann said that matter can be included in the algebraic approach, just as part of the complete Hamiltonian. (Obviously, it would be more appealing if we could derive matter instead of putting it in by hand -but can we?) José Antonio Zapata said that the basic thing we need to do is to understand how to build up a quantum theory on a differential manifold (one not previously equipped with a metric structure, I gather).

And now the last question. It asked all plenary speakers to state their "dream for Loops '17"; that is, on their most optimistic view, what is the title and abstract of the talk they imagine themselves presenting in ten years' time?

Many of the answers were predictable and variations of a basic template: abstracts saying "we present a complete theory of quantum gravity with testable (or, in the most ambitious cases, confirmed) predictions." Ashtekar said something like this, adding that his estimated probability for this scenario was 0%. (But he also gave the, in my opinion, rather optimistic figure of 50% for the probability of having some experimental evidence to start resolving ambiguities.) Reuter had one of the most concrete dreams: "It is shown that LQG is equivalent to Asymptotic Safety, and that the quantization ambiguities in it are finite in number and equal to the dimensionality of the Non-Gaussian Fixed Point." And finally, there was an extremely amusing exchange between Thiemann and Alejandro Perez, which is a fitting conclusion to this series of posts:

Thiemann (reading his dream abstract): "We present quantum gravity corrections to the electron fine structure, and find that they are in agreement with experiments carried out by the author"

[laughter from the audience]

Perez (reading his dream abstract): "We show that Thiemann's calculations are totally wrong."

[hysterical laughter from the audience]


Wednesday, July 25, 2007

Loops 07: Conference report (part 2)

This is the second part of my conference report on Loops 07. The first part was here. Remember that you can download the slides or the audio for most of the talks from the conference webpage.

Wednesday 27/06
Plenary talks

Moshe Rozali started by giving an excellent talk about background independence in string theory, a topic that has been the subject of legendarily long discussions on Cosmic Variance and other blogs. The main points of his talk were: a) Perturbative string theory is in fact background independent, being a generalization of GR in a background field gauge; it's just that the perturbative framework makes the background independence non-manifest. b) Holographic dualities provide a way of achieving background independence in a more explicit way. In AdS/CFT, a gauge theory on the boundary can be manifestly diffeomorphism invariant and be equivalent to quantum gravity in the bulk of AdS. Rozali stressed that only the asymptotics of AdS (i.e. a particular negative value of the cosmological constant) need to be fixed; the interior geometry is completely dynamical. Ashtekar seemed to disagree about the extent of this statement and tried to press for a discussion in the question session, but it was cut short for lack of time.

Klaus Fredenhagen talked about QFT in curved spacetime as a route to quantum gravity. He extolled the virtues of Algebraic Quantum Field Theory and the techniques of microlocal analysis for providing a sound axiomatic foundation to QFT in curved spacetime, and explained some recent results proven in this area. Then he discussed the application of this formalism to the graviton field treated as a perturbation around a classical background, and wondered about its relation to the methods of effective field theory.

My namesake and compatriot Alejandro Perez (with whom I have been confused a couple of times for those two reasons, though we look nothing alike) gave a rather technical talk involving strings, BF theory and path-integral sum over topologies. I wish I could say more about it, but I got lost soon after the introduction.

Martin Reuter gave an exceptionally clear and compelling presentation on Asymptotic Safety in Quantum Gravity. It covered more or less all the ground that he had covered in the Zakopane lectures in March, which I will not summarise again (click on the link), but also a few tantalizing new implications for cosmology. If the results of the "Einstein-Hilbert truncation" are accepted as approximately true, then the physical cosmological constant "runs" with the scale in the following way: it is constant (at its currently observed tiny value) at lengthscales larger than 10^(-3) cm, then starts growing as the fourth power of momentum (inverse length) until the Planck scale is reached, and from there on it grows quadratically. This means that in the early universe it was much larger than at present, but decreased as the universe increased its scale. This provides a natural mechanism for inflation without any driving field: the inflation was driven by the same cosmological "constant" that we see today, and was due to the intrinsic running with scale of this parameter. Reuter had some calculations that seemed to show that his model gives good results for the entropy of the universe, as well as a scale-invariant perturbation spectrum.
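
(If I understood the talk correctly, the running can be summarised very roughly as a piecewise behaviour of the dimensionful cosmological constant with the momentum scale k:

\[ \Lambda(k) \;\sim\; \begin{cases} \Lambda_0 , & k \lesssim (10^{-3}\,\mathrm{cm})^{-1} \\ k^4 , & \text{intermediate scales} \\ k^2 , & k \gtrsim m_{\mathrm{Planck}} \end{cases} \]

with \(\Lambda_0\) the tiny observed value. This is only my compressed summary of the slides, not a formula Reuter wrote in this form.)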

This is obviously the kind of thing that is either brilliantly right, or completely wrong. The "dark matter + small cosmological constant + inflation" model that is accepted in conventional cosmology gives predictions of extraordinary accuracy for many different observations (at least with respect to its first two elements). A lot of care would be needed to examine if Reuter's model can really emulate all the confirmed predictions, and whether it can make new ones that are testable. But if Reuter is right, then his talk was by a large margin the most important in the conference.

There were no parallel sessions on Wednesday afternoon, which was a free afternoon.

Thursday 28/06
Plenary talks

Daniele Oriti talked about Group Field Theory (GFT). According to him, GFTs (nonlocal field theories on group manifolds) can be interpreted as "second-quantized quantum gravity". They can be used as a general framework in which to rewrite discrete quantum gravity approaches such as LQG and spin foams. Oriti hopes that the elusive semiclassical limit of these theories may be more tractable with GFT methods. Instead of studying e.g. coherent semiclassical spin network superpositions, take a hugely populated "multi-particle state" of the GFT. The techniques of statistical field theory, used for the semiclassical limit of quantum mechanics in condensed matter theory, are well suited to being applied to GFTs. In this way one may even hope to define notions of "temperature" and "phases" as they apply to quantum spacetime. One interesting result that he mentioned towards the end, without much explanation, is that GFTs must be Fermi-quantized in the Lorentzian case and Bose-quantized in the Riemannian one. Can anyone explain to me what he meant by this?

By this time I was feeling ill and had a bit of a temperature (I had been warned against the local food, but...), so I went back to my hotel room to take some medicine and rest for an hour or so. I thus missed David Rideout's talk on supercomputers and came back for Martin Bojowald's on effective field theory applied to LQG, on which I had pinned high expectations. Bojowald rewarded these expectations by dedicating one slide of his talk to quoting this blog…




…well, not exactly. The idea of the talk was to replace exact equations for quantum states by semiclassical, effective equations for a finite number of moments of a state (expectation value, fluctuation, etc.). This method has been applied successfully to quantum cosmology. He hinted at the end at possible observable consequences in the inflationary perturbation spectrum and at computable corrections to the Newtonian potential (meaning the 00 component of the metric in FRW cosmology). These do not seem to match those computed in Donoghue's ordinary effective field theory, but I'm not sure whether this isn't simply because "Newtonian potential" means something different here.
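
(To give the flavour of the method in a familiar setting: for a single quantum particle in a potential, instead of evolving the full wavefunction one evolves the expectation values together with a finite set of moments, for example

\[ \frac{d\langle \hat x\rangle}{dt} = \frac{\langle \hat p\rangle}{m}, \qquad \frac{d\langle \hat p\rangle}{dt} \approx -V'(\langle \hat x\rangle) - \tfrac{1}{2}\, V'''(\langle \hat x\rangle)\,(\Delta x)^2 , \]

truncating the hierarchy at some order in the fluctuations. This toy example is my own illustration of the general idea, not something taken from Bojowald's slides; in quantum cosmology the role of x and p is played by the cosmological variables.)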

I kept feeling ill and missed almost all the other talks of the day, and didn't take notes in the few I attended. These included talks in the parallel sessions by Sundance Bilson-Thompson and fellow blogger Yidun Wan on models in which spin network braids are standard model particles. The next day would see Lee Smolin champion the same idea in a plenary talk. I returned early to rest in my hotel room and watch Argentina beat the USA 4-1 at football.


Friday 29/06
Plenary talks

As I was still not feeling perfectly well, I slept late and attended only the last two morning talks. The first was by blogfriend Sabine Hossenfelder, on Phenomenological Quantum Gravity. She has written up the introduction to the talk in this post, so I can do nothing better than recommend that you read it. The rest of the talk examined the generic predictions made by models such as Minimal Length, the Generalised Uncertainty Principle and Deformed Special Relativity. According to her, the main problem with all these models is an insufficient connection with fully developed fundamental theories.

Lee Smolin, as I said, talked on braided QG structures as elementary particles. He started by making the point that for LQG and related models of quantum spacetime to work, one needs to explain how low-energy excitations (gravitons, photons, etc.) can propagate through the spacetime foam without decohering with it. That is, one needs to identify "noiseless subsystems" and a ground state on which they propagate coherently, protected by an emergent symmetry. Then he presented the main result: a class of spin network models exists whose simplest coherent excitations (braided, embedded framed graphs) match the quantum numbers of the Standard Model's first generation of fermions. Higher generations can be included, at the cost of some exotic states. Interactions can be included. (But he did not say the crucial thing: whether these "interactions" match, or can be made to match, the U(1)xSU(2)xSU(3) gauge structure of the Standard Model.) Open problems are to include symmetry breaking and masses (all these degrees of freedom are massless), to find momentum eigenstates, and to establish conservation laws.

I can understand Smolin's excitement about these ideas, but for the moment I remain highly skeptical about them. The Standard Model is a lot more than a table with quantum numbers, and without much more development it will be hard to convince me that the behaviour of some pretty knots can reproduce the rich mathematical structure of Quantum Field Theory.


Parallel sessions

I chose to go to the sessions centred on black holes. William Donnelly gave a talk on the entanglement entropy of spin networks, and its use in the calculation of black hole entropy. Ashtekar expressed skepticism, saying that those calculations did not include the fact that the surface used is a black hole horizon; Donnelly answered that he assumes that any surface will have entropy for some observer accelerating in such a way that the surface is a horizon for him. Daniel Terno talked on how the bulk entropy of a graph scales with its boundary, hoping to identify a "holographic regime" of LQG. The conclusion is that LQG will not be holographic unless the Hamiltonian constraint dramatically reduces the allowed graph complexity. Bad news, I guess. Yidun Wan talked a second time, this time giving the talk of his colleague Mohammed Ansari, who couldn't make it to the conference. It was on an alternative framework to the "isolated horizons" one for dealing with quantum black holes. By a reasoning I could not follow, macroscopic corrections to Hawking radiation were predicted; Ashtekar was again skeptical. Another talk worth mentioning was Jacobo Diaz-Polo's, on the old problem of the black hole area spectrum in LQG. Jacobo and his collaborators did exact numerical calculations of the area degrees of freedom, without the approximations used for analytical calculations. They obtain, as usual, the Bekenstein-Hawking entropy as the leading term (up to a choice of the Immirzi parameter) and a universal logarithmic correction with prefactor -1/2. The number of states as a function of the area has an interesting structure, with evenly spaced peaks of degeneracy. If as a first approximation one considers only the states on the peaks, one gets an equidistant area spectrum and the Bekenstein-Mukhanov effect. Of course, all of this is purely kinematical (Jacobo himself stressed it) and the question of how to incorporate the dynamical constraint seems to remain as elusive as ever.
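
(For concreteness, the kind of result Jacobo quoted has the schematic form

\[ S(A) \;=\; \frac{A}{4\,\ell_P^2} \;-\; \frac{1}{2}\,\ln\frac{A}{\ell_P^2} \;+\; \mathcal{O}(1) \,, \]

i.e. the Bekenstein-Hawking term plus the logarithmic correction with prefactor -1/2 mentioned above. I am writing this from memory, so treat the subleading structure as indicative only.)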

This will be enough for today. My next and last post on the conference will describe the last day's two plenary talks and the discussion session that closed the conference. As always, if anyone has anything to add to my summaries, thinks I forgot something important, or wants to correct some egregious mistake, they are more than invited to do so.


Tuesday, July 17, 2007

Loops 07: Conference report (part 1)

This will be a very long post, or more likely, the first of a series of very long posts. So let me skip quickly over the praise for the quality of the conference and the people present (I met many old friends, both from the real and the virtual worlds) and go directly into the physics. Remember that you can go beyond my comments and get both the slides and audio for most of the talks at the conference website.

Monday, 25/06
Plenary talks

Lucien Hardy talked on the causaloid formalism for quantum gravity. It was actually a foundations of quantum mechanics talk, based on an "operationalist" philosophy: data are recorded, and physics tries to predict probability correlations among data. These probabilities behave differently depending on whether or not the data are from "causally connected regions"; this allows a definition of what is meant by causal connection in background-independent theories. I found the talk interesting but think that concrete progress in quantum gravity is unlikely to come from such an extremely "top-down" approach. As a matter of philosophical principle, I am suspicious of theories motivated by philosophical principles.

Rafael Sorkin spoke on "anhomomorphic logic". Another foundations of QM talk. Sorkin favours a quantum logic interpretation, in which propositions describing unobserved microevents (e.g. "the particle passed through the lower slit") are assigned truth values that behave according to axioms different from those of classical logic. Besides what I said above on Hardy, I was especially suspicious of this approach because it "adds structure" that is not present in the bare quantum mechanics formalism, for reasons that I find unmotivated.

John Donoghue talked next on Effective Field Theory of General Relativity. This was a much-anticipated talk, and it was also referenced by many of the following speakers. It was an introduction to effective field theory and the way it provides a consistent perturbative theory of quantum gravity for sub-Planckian energy scales. Donoghue emphasized that scattering amplitudes and physical results such as the first quantum correction to the Newtonian potential can be calculated unambiguously and independently of the high-energy completion of the theory, and that any theory that claims to provide this completion (such as LQG) must recover these corrections as well as the classical "zeroth-order" theory. According to Donoghue, the Problem with a capital P is not "reconciling GR and QM" but finding the fundamental high-energy theory that completes quantum GR at the Planck scale. I think, however, that when most people in what is loosely called the "LQG community" talk about reconciling GR and QM, they mean much more than what is provided by effective field theory. What is wanted is a quantum theory in which spacetime is fully dynamical, and the EFT results (while important, and truly a nontrivial check for any proposed theory) are still very far from this, as they are based on the perturbative framework of QFT.
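
(The correction Donoghue refers to is usually quoted, if I remember the coefficients correctly, as

\[ V(r) \;=\; -\frac{G m_1 m_2}{r}\left[\,1 \;+\; 3\,\frac{G(m_1+m_2)}{r c^2} \;+\; \frac{41}{10\pi}\,\frac{G\hbar}{r^2 c^3}\,\right], \]

where the second term is the classical post-Newtonian correction and the last one is the genuinely quantum, but completely unambiguous, piece. The numerical coefficient went through several corrections in the literature, so don't quote me on the 41/(10π).)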


Parallel sessions

My own talk "The transition rate of an Unruh detector in a general spacetime" was scheduled for one of the Monday afternoon sessions. It went quite well, with only a brief question by Jerzy Lewandowski during the presentation, and no questions afterwards (though a couple of people came to talk to me expressing interest later). I suspect many people couldn't understand much, between my illegible handwriting on the transparencies (I promise to use software next time!), the bad quality of the projector, and the high speed of my speaking due to nerves. I was feeling uncommonly nervous, both before and after the talk, and I didn't take many notes on other talks that afternoon. I have some notes on Rodolfo Gambini's talk, about how quantum mechanics is modified when, instead of an abstract time variable, we use a physical clock, subject to decoherence, in the Schroedinger equation; of course, "unitarity" in this time variable is lost. He then argued that there are fundamental limitations on any clock at the Planck scale, and therefore quantum mechanics would need modifications there. I think that a "timeless" formalism of QM (such as Rovelli, Oeckl and others have tried to build) is needed before one can assess these arguments. Guillermo Mena-Marugan and Iñaki Garay talked about quantizations of restricted classical solutions of GR, the Gowdy model and Einstein-Rosen waves respectively; Iñaki had some nice plots of quantum solutions exhibiting both classical and non-classical behaviour. Garrett Lisi then talked on his ambitious "theory of everything" that attempts to describe the whole Standard Model, gravity included, with a single Lie group, E8. When hearing it I thought it was just a formal game, and was surprised to see Lee Smolin ask interested questions, and even more so when I saw that John Baez had written a whole TWF column on this theory.


Tuesday, 26/06
Plenary talks

This was "the big LQG day", with talks by heavyweights Thiemann, Ashtekar and Rovelli. There was also a talk by Jan Ambjorn about the discrete sum over histories approach, but I missed it.

Thomas Thiemann gave a summary of things known and unknown in Loop Quantum Gravity. For me it added little to what he had covered in the more complete series of lectures in Zakopane. "Secured land" includes the kinematical framework, the LOST theorem, the area operator spectrum, and kinematical coherent states. "Uncharted territory" includes his more recent Master Constraint Operator (M) to define physical states and the checking of its good semiclassical behaviour. "Open problems" are whether 0 is in the spectrum of M and whether there are anomalies; the resolution of quantization ambiguities in the definition of M; a systematic calculational framework for physical states; a connection with quantum field theory in curved space, with perturbative theory, and a definition of gravitons and Feynman graphs; and conceptual issues related to the problem of time and relational observables. In response to a question from the audience, he admitted that little or no work had been done to connect LQG with the effective field theory results. I think that everyone came out of the conference agreeing that this is an extremely important thing to do.

Abhay Ashtekar gave a summary of results in symmetry-reduced models: loop quantum cosmology and "loop quantum black holes". He started by arguing that while results in symmetric models do not prove generic validity, they cannot be dismissed a priori either; witness the example of the hydrogen atom spectrum, predicted correctly from a symmetric model, against the complexity of solving full QED. He next summarised the by now familiar results of LQC: the Big Bang singularity is replaced by a bounce, both in the zero and positive curvature cases. An important feature is that the correct semiclassical limit heavily constrains how ambiguities in the Hamiltonian are resolved. Similar bounces avoid the singularity in black hole spacetimes, showing that there is no information loss and that evolution is deterministic throughout the quantum regime into a new classical region. It is also known that these bounces are stable against small perturbations.
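
(The effective equation usually quoted for the flat case, which encapsulates the bounce, is

\[ H^2 \;=\; \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_c}\right), \]

where \(\rho_c\) is a critical density of the order of the Planck density: when \(\rho\) reaches \(\rho_c\) the Hubble rate vanishes and the contraction turns into expansion. I am not sure Ashtekar wrote it in exactly this form in the talk, but it gives the idea.)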

Carlo Rovelli asked a question at the end of Ashtekar's talk, one that has worried me for a long time, that I discussed briefly here about a year ago, and that has recently been discussed at Cosmic Variance (see the previous post here for the link; I can't access CV now). In our universe, the Big Bang was a state of uncommonly low entropy; this ensures the existence of an arrow of time, because entropy has naturally grown since then. If there was a collapsing phase and a bounce before the Big Bang, what was happening to entropy in it? Symmetry seems to demand that it decrease –but "naturally" a gravitational collapse increases entropy to a maximum, as in a black hole. The collapsing universe would need to be extremely fine-tuned for entropy to decrease in it.

I couldn't follow Ashtekar's answer to Rovelli, but later I found an opportunity to pose the question to him again during a coffee break. He said that while matter entropy is very difficult to analyze in the simple models that have been studied so far, gravitational entropy –the "likeliness" of the gravitational state- does indeed seem to behave symmetrically in the bounce models. The quantum regime near the singularity is a very special, intrinsically low-entropy state. I was convinced by Frank (fh) in the discussion here a year ago that if this is so, the most natural description of the situation is not "a previously collapsing universe with decreasing entropy followed by an expanding universe with increasing entropy" but "a low entropy state that expands, increasing entropy, in both time directions". In other words, it seems more natural to define the "positive time direction" at each of the two stages by the increase of entropy, even if this gives two different results and time is no longer a "line" but a "double arrow". Surely, if there were observers in the (from our point of view) "collapsing" phase, they would take themselves to live in an expanding universe, if, as seems almost certain, the psychological arrow of time is tied to the thermodynamic one. Ashtekar, however, didn't seem to think much of this point of view (probably dismissing it as too philosophical). For him the scalar field that serves as "internal time" in these quantum cosmology models is the true "clock", and it is monotonically increasing.

I am still puzzled, however, about what happens with entropy in the closed universe model (positive curvature without dark energy). Under quantization this one becomes cyclic, expanding and contracting again at a regular rate. What happens when the apex of the expansion is reached? Does entropy reverse itself suddenly, as in Gold's old cosmology? But how can this be, if the moment of maximum expansion is completely classical and localized systems should follow ordinary mechanical and thermodynamical laws without knowing about the cosmological turnaround? I find this very perplexing. A possible way out is that the existence of dark energy with its actual value, which accelerates the expansion and ensures that the universe is not cyclic, is somehow not an accidental but a necessary feature of the universe, so that the cyclic model will ultimately be shown to be inconsistent. But this is only a personal hope. See also my old review of Price's book on the arrow of time for more discussion of these questions.

Going on with the conference: Carlo Rovelli talked next about the new spinfoam vertex, an improved model that aims to replace the Barrett-Crane one. He discussed at length the graviton propagator calculation he and his collaborators did a couple of years ago, explaining that since then the nondiagonal terms of the propagator had been computed and found to be wrong –but only because the Barrett-Crane model was used! Using the new model the problem is solved. The key difference is that the second class simplicity constraints are imposed weakly rather than strongly. In the improved model the boundary states of spin foams match exactly the spin network states of canonical LQG, and the intertwiner degrees of freedom remain free. (There were some technicalities about all this that I couldn't follow, but if you are interested download the slides and audio; it was a very clearly delivered talk.) The conclusion was optimistic: Carlo believes that this model may be the key to reconciling the "canonical" LQG approach and the "covariant" spin foam one.

Parallel sessions

The talks I attended this afternoon were mostly about highly technical aspects of LQG and spin foams, and I don't want to bore either myself or you by writing much about them. I will comment only on two of them which were of special importance, to me at least. Kristina Giesel talked about the work she did with Thiemann on Algebraic Quantum Gravity, a new version of LQG which is defined in a purely "combinatorial" way; spin networks are abstract graphs, not embedded in any pre-existing manifold. Semiclassical analysis, however, can be done by specifying a 3-manifold and a classical phase space point in it, and constructing coherent states peaked on that geometry. The zeroth-order and first-order terms in hbar of the expectation value of the master constraint in these states come out correct; what is unknown is whether there are anomalies in M and whether 0 is in its spectrum. The second talk I want to remark upon was Eugenio Bianchi's, on work related to the graviton propagator calculations. He presented computations of large scale area correlations in spin foam models, for boundary states peaked on a classical geometry, and showed that they agree exactly with those computed in perturbative Regge calculus. The point is that correlations calculated in a semiclassical state of the full, nonperturbative theory are here compared with correlations in the vacuum state of the perturbative theory around a corresponding classical solution. Finding agreement is a nontrivial check for the spin foam model. In this case the model was Barrett-Crane, but Eugenio thinks the results still hold in the "improved" model Rovelli had talked about.

And this is enough for today. The rest of the conference will be covered in one, or perhaps two, following post(s). As usual, stay tuned!


Tuesday, May 08, 2007

Quantum Mechanics in words of one syllable

Some think Quantum Mechanics is impossible to explain in simple terms. To refute this idea I have composed this piece, which is inspired both by the “Theory of Relativity in words of four letters or less” and by “Philosophy in words of one syllable”. I have not used any kind of dictionary or thesaurus.


In this piece we will talk of small things and how they are. How small, you ask? Well, close to the scale of Planck’s h. This will mean the bits of stuff the world is made of: the bit of charge, the bit of light, and so on. These small bits are quite weird, not at all like the large stuff we are used to.

If you have a set of those bits, at each time it will be in a state. We write the state as a ket, as Paul taught us. The ket for state A looks like this: IA>. The ket has in it all one can know of the small bits we talk of. The state will change in time, of course. To say how much it will change per sec (its rate of change) we have a rule Er told us: i times the rate of change of the ket is H times the ket. H is a key thing; it keeps track of how much stuff is there is, the mass of each bit of stuff, which is the force that acts on each bit, and so on.

So we know how states change, fine. How do we get from this to a claim on what we will see in the lab? Here is where things are not like we are used to. In the large world we are used to, when we know the state of a thing we know that if we look at the thing we will see it in that state. But here, if we know the state we can’t tell in which state will we see it. We will have a chance to see it in a new state. If it was in IA>, it may be that we see IB> when we look. We can’t tell for sure. But we can tell what chance we have to see each state.

For that we use a rule that a guy called Max Born gave us. To get the chance to see state B when you look at some stuff that is in state A at that time, do some math called "ket A dot ket B", or IA> . IB>. What comes out of this math has to do with the chance to see B if the state is A. But it is not quite that, ‘cause the chance must be real and what we have now turns out to have i in it. Darned math! But just take the square of the real part and add to it the square of the part with i (that is, take the norm of what you have) and you will get the chance to see B if the state is A.

There are lots of things one can want to look at when one looks at stuff in the lab: Mass, charge, where bits are, how fast they move, and so on. For each of these things there are some states called “self states” of the thing. When the state is a self state of a thing, we can tell what we will see if we look at that thing. But in most states we can’t. As a case, if we add two self states IA> and IB> of a thing the new state IA> + IB> will not be a self state for that same thing. If this is the state, when we look at that thing we may see A or B, each with a chance of one half. Once we have looked, the state will be A or B; but not till we look.

This leads to the well known case of Er’s cat. If the state of some bits of stuff is IA> + IB>, and we put those bits in a box with a cat, and make things such that state A will in due time kill the cat, while state B lets it live, then the cat will not live nor die till we look in the box! Weird, huh?

I know you must want to say: “No way! For sure, the state was A or was B all the time, we just did not know which till we looked!” No such luck. If the state is IA> + IB>, then it may be that the A part and the B part “mix” and we can tell that the state was not “A or B” with a look at some things in which you see that mix. It is hard to do for a large thing, like a cat, ‘cause you would have to keep track of each bit of stuff; but it has been done for small things, and there is no clear line that breaks small from large.

Old Al did not like all this stuff. He was sure God did not play dice with the world. He and two pals came up with a thought to show all this must be wrong. Take two bits of stuff, say two light bits. Let the state be one in which both bits must have a thing not the same; say, one of them has “spin up” and one “spin down”, but the state makes not clear which has which spin. (Spin is like a turn ‘round that the bit may have; this turn may point up or down for each bit.) Let the two bits move a lot one each on its way, so they come to be far. Now look at the spin of the first bit. If you see it “up”, the spin of this bit has changed from an “up plus down” state to “up”, and the spin of bit two has changed at the same time from state “down plus up” to “down”. But how can the state of bit two change when we look at bit one, which is far from bit two? It makes no sense.

But it does. A smart guy called John Bell came up with a slick way to test this stuff, and it turned out that Old Al had been wrong for once. One is forced to grant that bits of stuff can “know” what goes on far, far from them, or else that in a sense things are not “out there” till we look at them! More and more weird, I say. It may be that some day we will make sense out of this. By now, guys are still not of one mind on what one should say. But, at the same time, with all this we can work out, know and grok lots of stuff. So it must be true. So when one starts to ask a lot, like “what does it mean”, some guys say: “Shut up and work!”

Coming next: Quantum Field Theory in words of one syllable. Quantum Gravity is much easier; one just needs to write: “What?”

Labels: ,


 
/body>