Tuesday, October 23, 2018

Loud black holes

Stephen Hawking's most famous contribution to physics was the proposal that, when Einstein's General Theory of Relativity is combined with quantum mechanics, black holes might actually glow like any hot object, even though they are perfectly black. It's not just that black holes could be hot and glow, but that they are inherently hot (or at least a little bit warmer than absolute zero), with a temperature inversely proportional to their mass. It remains unclear whether Hawking's theory is actually true, but even just as a hypothesis it has inspired a generation of theoreticians, because it suggests some kind of profound connection between relativity, quantum mechanics, and thermodynamics.
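For reference, the formula Hawking derived for the temperature of a black hole of mass M is

$$ T_H = \frac{\hbar c^3}{8\pi G M k_B}, $$

which for a black hole of the Sun's mass works out to only about sixty billionths of a degree above absolute zero.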

We haven't yet been able to test Hawking's hypothesis astronomically, because even though we've found things far out in space that we're sure are black holes, the thermal glow they would have if Hawking were right would still be too faint to see. If we could get a little black hole in a lab here on Earth, we might make revolutionary observations ... but we probably shouldn't try this.
 
Let's not do this.


An idea from William G. Unruh might let us study black holes here on Earth in a safe way, however, by simulating them with fluid dynamics. What Unruh pointed out is that sound waves in a flowing fluid behave mathematically just like light waves in the curved spacetime of General Relativity. The sonic analog of a black hole—a sonic ergoregion—is a region in which the fluid itself is flowing faster than its own speed of sound. Sound waves can pass into this region, but cannot get back out again, because the supersonic fluid drags them along faster than they can run.
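For readers who want the mathematics: in Unruh's analogy, long-wavelength sound in a fluid with density ρ, flow velocity v, and sound speed c_s propagates exactly as a massless field would in the effective "acoustic metric"

$$ ds^2 \propto \frac{\rho}{c_s}\left[-c_s^2\,dt^2 + \left(d\mathbf{x} - \mathbf{v}\,dt\right)\cdot\left(d\mathbf{x} - \mathbf{v}\,dt\right)\right], $$

so the surface where the inward flow speed reaches c_s plays the role of the event horizon.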

Unruh's fluid analog idea has been pursued by quite a few people over the years. Back in 2001 Luis Garay, Ignacio Cirac, Peter Zoller and I initiated the theoretical study of how to make sonic black holes in Bose-Einstein condensates. BECs would be great candidates, we thought, because they are very controllable (so we might get a supersonic flow going), they are superfluid (so no friction to slow down the flow), and they are quantum mechanical (so Hawking's effects might show up). Other people have greatly extended this work, and just a few years ago Jeff Steinhauer actually did some experiments on supersonic condensates and found quantum-correlated sound waves coming out of the sonic black hole, very much like the quantum radiance of Hawking's theory.
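As a toy illustration of the horizon condition (this is just a sketch with made-up profiles, not anything from our papers), here is how one would locate the sonic horizon in a one-dimensional condensate, using the standard Bogoliubov sound speed c = sqrt(gn/m):

```python
import numpy as np

M = 1.0       # atomic mass (units with hbar = 1)
G_INT = 1.0   # interaction strength g (made-up value)

def sound_speed(n):
    """Bogoliubov sound speed c = sqrt(g*n/m) for condensate density n."""
    return np.sqrt(G_INT * n / M)

# Made-up profiles: density falls and flow speeds up downstream
x = np.linspace(-10.0, 10.0, 2001)
n = 0.2 + 1.0 / (1.0 + np.exp(x))     # condensate density (example only)
v = 0.1 + 1.2 / (1.0 + np.exp(-x))    # flow speed (example only)

mach = v / sound_speed(n)             # local Mach number
horizon = x[np.argmax(mach >= 1.0)]   # first point where the flow goes supersonic

print(f"sonic horizon near x = {horizon:.2f}; the supersonic region is the 'dumb hole'")
```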

"Beyond the mountains, more mountains" is a Haitian saying. Now that we've seen Hawking effects in a lab here on Earth, we have new questions. One concerns a simple geometrical issue. Our 2001 work assumed a one-dimensional model for simplicity, and everyone after us seems to have maintained this convention, even up to Steinhauer's experiments, which use long, thin clouds of ultracold gas. A subtlety that we and everyone else seem to have ignored, however, is that Unruh's analogy between fluid flow and curved spacetime really needs at least two spatial dimensions. In 1D it only kind of works, roughly.

There are other good reasons to look at black holes in more dimensions as well. Bekenstein's tantalizing concept of purely geometrical entropy in black holes needs at least two dimensions, because the horizon of a one-dimensional flow is just a point, with no length or area to carry entropy. So in the past year we finally bit the computational bullet and did some theoretical simulations of two-dimensional sonic black holes. What we found is an instability that doesn't show up in one dimension: the formation of vortices. Our sonic ergoregion became a turbulent sea of quantized whirlpools. It generated plenty of sound waves, but not the kind of sound predicted by Hawking.
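For readers who like to see machinery: quantized vortices in a two-dimensional condensate wavefunction can be located by measuring the winding of the phase around each plaquette of the numerical grid. This is the standard diagnostic, sketched here on an analytic test state rather than on our simulation data:

```python
import numpy as np

def vortex_charges(psi):
    """Return an integer array of phase windings (in units of 2*pi) for each
    grid plaquette of a 2D complex wavefunction psi. Nonzero entries mark
    quantized vortices (+1) and antivortices (-1)."""
    phase = np.angle(psi)

    def dphase(a, b):
        # phase difference wrapped into (-pi, pi]
        return np.angle(np.exp(1j * (b - a)))

    # accumulate wrapped phase differences counterclockwise around each plaquette
    w = (dphase(phase[:-1, :-1], phase[1:, :-1])
         + dphase(phase[1:, :-1], phase[1:, 1:])
         + dphase(phase[1:, 1:], phase[:-1, 1:])
         + dphase(phase[:-1, 1:], phase[:-1, :-1]))
    return np.rint(w / (2 * np.pi)).astype(int)

# quick self-test on a single analytic vortex at the origin
x = np.linspace(-5.0, 5.0, 200)
X, Y = np.meshgrid(x, x, indexing="ij")
psi = (X + 1j * Y) / np.sqrt(X**2 + Y**2 + 1e-12)  # unit-charge vortex
print("total winding found:", vortex_charges(psi).sum())  # should print 1
```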

Vortices and sound waves from a sonic black hole. The power spectrum is not thermal.
So is this bad news for analog black holes? That depends on how you look at it, because the analogy between fluid flow and curved spacetime only ever holds for mild sonic ripples. Once the fluid starts doing violent things like forming vortices, the analogy breaks down. There's no reason to expect nonlinear quantum fluid dynamics to reproduce nonlinear quantum gravity.

So one interpretation of our results would be to say that the appearance of vortices marks the end of the experiment, and only everything up to that point counts as a black hole simulation. As long as we only have low-amplitude sound waves moving through a smooth, steady flow, Unruh's analogy still works, and Hawking's theory should still be confirmed.

The problem with this, though, is that the low-amplitude regime in which everything works is not the regime that holds mysteries. We don't need cutting-edge experiments to solve linear equations, and anyway there was never any doubt about Hawking's linear calculations as far as they went. The big question is whether black hole thermodynamics survives in fully nonlinear quantum gravity. 

Unruh's analogy was never supposed to go beyond the linear regime, though. So why have we even been bothering with analog black holes at all, if they are inherently incapable of answering our only real questions? 

Well, when it comes to quantum gravity, we are desperate. We've stopped hoping for reliable answers. We'd be glad if we could just get some hints. And although nonlinear quantum fluid dynamics certainly won't reproduce nonlinear quantum gravity, it does at least give us a real, physical system which has nonlinear quantum dynamics and can also look like curved spacetime in some situations. 

We don't actually know anything more than that about real quantum gravity. Recent calculations based on string theory have suggested that real black holes out in space may be quantum "fuzzballs" that are quite different, inside their ergoregions, from the classical black holes of General Relativity. Our turbulent ergoregions do happen to look something like this.

That may well be a meaningless fluke. Even so, it means that experiments on sonic black holes in Bose-Einstein condensates will be able to put some data points on the big empty graph of nonlinear quantum systems with even a limited resemblance to General Relativity. We know so little about quantum gravity, and it would be so wonderful to learn about black hole thermodynamics. It's well worth listening to what the sonic black holes have to say.

Interested readers can find our paper in the open-access online journal New Journal of Physics. (NJP is a joint venture of the Institute of Physics and the Deutsche Physikalische Gesellschaft. It's a mainstream peer-reviewed journal that is one of the standard examples to show that not all open-access online journals are scams.)


Wednesday, August 23, 2017

Tiny Quantum Engines

The previous post mentioned some work on minimal mathematical models for combustion engines, and ended with a link to our first published paper on this subject. That's the paper in which we introduce the class of dynamical systems that we call Hamiltonian daemons. The Hamiltonian part of this name refers to William Rowan Hamilton's formulation of classical mechanics. Hamilton's restatement of all Newton's F = ma stuff dates from 1833 but is still the dominant dialect in physics today, mainly because it adapts well to quantum mechanics.
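In Hamilton's dialect, a system is described by coordinates q, momenta p, and a single energy function H(q, p), and Newton's laws become the symmetrical pair of first-order equations

$$ \dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q}. $$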

The daemon part of our name for our systems is partly meant to recall Maxwell's Demon. The devout Presbyterian Maxwell wasn't thinking of an evil spirit: he meant the kind of benevolent natural entity that was the original meaning of the Latin word daemon. Maxwell's daemon was an imaginary tiny being who could manipulate the individual atoms in a gas.  Physicists still think about Maxwell's daemon today, as a way to express questions about the microscopic limits of thermodynamics.

Our daemon name is also partly in analogy to Unix daemons, which are little processes that operate autonomously to manage routine tasks without user input. Hamiltonian daemons are mathematical representations of small, closed physical systems which, without any external power or control, can exhibit steady transfer of energy from fast to slow degrees of freedom. In this way they are the ultimate microscopic analogs of combustion engines, and learning about them may teach us about the role of thermodynamics on microscopic scales.
A simple Hamiltonian daemon: like a tiny car driving uphill. 
(The figure in our published paper does not have the little 
pictures of cars. Physical Review is too serious for that.) 

To illustrate how daemons work we took a simple model that resembles a tiny car trying to drive up an infinite hill. The car's engine has to be started in the old-fashioned way, by pushing the car to get it turning, so the car starts out already rolling uphill at some speed. Since it's going uphill, it slows down; some of the time (the dashed curve), the engine fails to ignite and the car just keeps slowing down until it rolls back down the hill.

Other times, however, the engine catches (solid curve). The car then keeps on driving uphill at a steady speed, expending high-frequency fuel. Since our simple hill is infinite, when the fuel runs out the car will eventually roll back down, but if we add the complication of making the hill level off at some point, the car can climb to the top and then drive on to some goal. 
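For the curious, here is a minimal classical toy in the same spirit (a sketch made up for this post, not the model from our paper): a slow "car" coordinate q on a uniform slope, resonantly coupled to a fast "fuel" oscillator in action-angle variables (J, θ) through the made-up Hamiltonian H = p²/2M + Mgq + ΩJ + εJ cos(θ − kq). When the phase θ − kq locks, the fuel action drains at a steady rate while the car holds the resonant speed Ω/k; when the phase slips, the car just rolls back. With these invented numbers, the initial phase of the fuel oscillator decides which branch you get, echoing the solid and dashed curves in the figure.

```python
import numpy as np
from scipy.integrate import solve_ivp

# All parameters are made up for illustration
M, GRAV = 1.0, 1.0      # car mass and slope strength
OMEGA, K = 10.0, 10.0   # fuel frequency and coupling wavenumber (resonant speed OMEGA/K)
EPS, J0 = 0.02, 20.0    # coupling strength and initial fuel action

def rhs(t, y):
    """Hamilton's equations for H = p^2/2M + M*GRAV*q + OMEGA*J + EPS*J*cos(theta - K*q)."""
    q, p, theta, J = y
    phi = theta - K * q
    return [p / M,                                  # dq/dt     =  dH/dp
            -M * GRAV - EPS * J * K * np.sin(phi),  # dp/dt     = -dH/dq
            OMEGA + EPS * np.cos(phi),              # dtheta/dt =  dH/dJ
            EPS * J * np.sin(phi)]                  # dJ/dt     = -dH/dtheta

for theta0, label in [(np.pi, "engine catches"), (0.0, "engine fails")]:
    sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.9, theta0, J0], max_step=0.01)
    q, p, J = sol.y[0, -1], sol.y[1, -1], sol.y[3, -1]
    print(f"theta0 = {theta0:4.2f} ({label}): height {q:8.1f}, "
          f"speed {p / M:6.2f}, fuel left {J:5.2f}")
```

In the locked branch the car holds its speed near Ω/K while J drains toward empty; in the slipping branch the fuel stays almost full and the car rolls back down the hill.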
Schrödinger's Cat 
as a power source?

More recently we thought, "Hey, really small engines should be quantum mechanical." What would happen if you tried to extract work from a fuel-powered engine that was entirely quantum? What would a quantum Hamiltonian daemon be like? Well, we found out. 

It turns out that the quantum daemon behaves rather like a random daemon. Sometimes the engine ignites, and sometimes it doesn't—even if you do everything exactly the same every time. The quantum daemon also burns its fuel quantum by quantum, and it keeps driving uphill by giving itself a series of kicks that suddenly bump up its speed. After every such kick, there is a chance that the quantum engine will spontaneously stall, and refuse to run further, even if there is fuel left. So over time the quantum probability distribution for the height of the car develops a series of branches, as in the figure here.
The quantum daemon burns fuel in quantized steps, 
and can randomly stall.

It's not just randomness going on at every kick, though. If you look closely you can see interference fringes in the probability distribution. Since the whole daemon system is closed, the evolution is actually all unitary. The branches of the figure, each representing a different possible trajectory of the car, are all in coherent quantum superposition. Since there is more fuel left if the engine stalls sooner, but less fuel if the engine goes on longer, the superposition of all the branches is a total quantum state with high quantum entanglement between fast and slow degrees of freedom. And since the number of branches grows over time as the quantum daemon operates, the von Neumann entropy of the slow degrees of freedom increases.
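For readers who want the formula: writing the pure state of the whole closed daemon as |Ψ⟩, the slow degrees of freedom are described by the reduced density matrix obtained by tracing out the fast (fuel) degrees of freedom, and the entropy in question is

$$ \rho_{\mathrm{slow}} = \mathrm{Tr}_{\mathrm{fast}}\,|\Psi\rangle\langle\Psi|, \qquad S = -k_B\,\mathrm{Tr}\left(\rho_{\mathrm{slow}}\ln\rho_{\mathrm{slow}}\right), $$

which grows as the branches multiply, even though the total state remains pure.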

Readers who are familiar with quantum mechanics may want to read more in Physical Review E 96, 012119 (2017), or in the free and legal arXiv e-print version. Our first paper on daemons (Phys. Rev. E 94, 042127 (2016)) is also on arXiv.

Friday, January 13, 2017

Humphrey Potter and the Ghost in the Machine

Humphrey Potter adds strings to Mr Newcomen's engine so that he can go play.
The first steam engines were slow-working beasts that needed constant human tending. To work its minutes-long cycle, the first Newcomen steam engine needed a human hand to open and close valves for cold water and steam. The machine provided great power, but someone had to control it.

At some point, however, some brilliant mechanical mind noticed that the machine was, after all, creating its own motion. Why not connect the moving piston to the valves, and let the engine run itself?

Some sources even claim to identify the person who first made this brainstorm work: a young boy named Humphrey Potter, who was paid to operate a Newcomen engine by hand. Young Potter hooked up a system of “strings and latches” that made the machine itself do his work for him. Then he ran off to play.

Apparently the sources for this story are not considered reliable. I can’t find any of them myself, and the modern texts that mention Humphrey Potter refer to his story as a legend. That’s a shame, because I’d like to think I could put a name to the person who first made a useful engine run all by itself. 

Apart from the practical advantages of getting a machine to run itself, the scientific point of lazy Potter’s clever trick is that it took any form of intelligence out of the loop of doing work. The entire engine process, from fuel combustion to pumping water, was now a purely mechanical operation. Humphrey Potter banished the ghost from the machine. He made it clear that everything that was occurring, in the marvelous process that turned lumps of coal into useful work, was occurring strictly under the basic laws of physics. It all ran all by itself.

In one way we take this insight for granted now, and may even extend it to processes more complex than lifting weight by burning coal, processes like life and consciousness. Yet in a very practical sense science still has not fully taken the point that engines can run as closed systems, without external power or control, and without any ingredients beyond basic physics. Engineers invoke higher level concepts like pressure and temperature, and while these are clearly valid, they leave a lot of details hidden under the hood. Theoretical physicists analyze parts of the whole process, like how gas particles adapt their motion when a piston moves in a predetermined way, but they do not simultaneously consider how the piston motion is itself determined by the gas particles bouncing off it. We don’t really believe there is any ghost in the machine, but when we have to explain how the machine works, somehow we keep sneaking the ghost back in, in some form or other.
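To show what closing the loop means, here is a crude sketch (made up for this post, not our published model): a one-dimensional gas whose particles bounce elastically off a movable piston, while the piston simultaneously recoils from those same impacts and is squeezed by a constant load. Nothing outside the update rule steers anything.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up parameters: N gas particles, a heavy piston, a constant load force
N, M_GAS, M_PISTON, LOAD = 500, 1.0, 200.0, 50.0
DT, STEPS = 1e-4, 200_000

x = rng.uniform(0.0, 1.0, N)   # particle positions between wall (0) and piston
v = rng.normal(0.0, 1.0, N)    # particle velocities
X, V = 1.0, 0.0                # piston position and velocity

for _ in range(STEPS):
    x += v * DT
    X += V * DT
    V -= (LOAD / M_PISTON) * DT           # the load pushes the piston inward

    hit_wall = x < 0.0                    # specular reflection at the fixed wall
    x[hit_wall] *= -1.0
    v[hit_wall] *= -1.0

    hit = x > X                           # elastic collisions with the piston:
    if hit.any():                         # the piston recoils from every impact
        u = v[hit]
        v[hit] = ((M_GAS - M_PISTON) * u + 2.0 * M_PISTON * V) / (M_GAS + M_PISTON)
        V += M_GAS * np.sum(u - v[hit]) / M_PISTON   # exact momentum balance
        x[hit] = 2.0 * X - x[hit]

print(f"after t = {STEPS * DT:.0f}: piston at X = {X:.2f}, "
      "oscillating about the point where gas pressure balances the load")
```

Crude as it is, it is a completely closed loop: the piston's motion is determined by nothing but the particles bouncing off it, and vice versa.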

If there's something strange
in your phase space neighborhood ...
At least, until now. Quite recently we have discovered a very simple mathematical model for a minimal kind of combustion engine that runs as a closed system under basic physics. So we now have a bare-metal, first-principles model for an engine. Its operation is in some ways very reminiscent of a steam engine, but in other ways it is radically different. We are hoping that it may teach us about the microscopic roots of thermodynamics, but someday, perhaps, it might be the basis of a whole new class of power nanotechnology.

Readers who know Hamiltonian mechanics may enjoy our first paper on this subject. "Hamiltonian analogs of combustion engines: A systematic exception to adiabatic decoupling" is published in Physical Review E 94, 042127 (2016), and is also available in e-print form at https://arxiv.org/abs/1701.05006.

Wednesday, October 1, 2014

Newton versus Heisenberg

Death of a microbe
The Heisenberg Uncertainty Principle has become famous well outside quantum physics, but it is often cited in garbled form. I once found it mentioned in a textbook on social science research methods, where it was defined as the fact that electron microscopes can damage samples when they observe them by electron bombardment. Physicists roll their eyes at such misunderstandings, but I think that physicists themselves often misunderstand the Uncertainty Principle, by confounding Heisenberg’s specifically quantum mechanical relation with a principle of measurement that goes back to Newton. 

What the Uncertainty Principle really means, in practical terms, is this: if you construct an apparatus to control one experimental property more tightly, so that its value varies less between repetitions of the experiment, then past a certain point some other property will always become correspondingly less well controlled, with wilder run-to-run variations. The ‘certain point’ at which this trade-off sets in represents a degree of precise control so high as to be quite unattainable in the macroscopic world anyway. So although the Uncertainty Principle is profound, it is really irrelevant to fields like social science research.
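For position and momentum the trade-off is quantitative:

$$ \Delta x\,\Delta p \;\geq\; \frac{\hbar}{2}, $$

where Δx and Δp are the standard deviations over many identically prepared runs, and the tiny size of ħ ≈ 10⁻³⁴ joule-seconds is what puts the ‘certain point’ so far beyond macroscopic reach.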

Observing something means letting it act on you.
What even many physicists think the Uncertainty Principle means, however, is something that really is widely relevant: the fact that no measurement can ever be purely passive, but always affects the thing being measured. This principle is both true and important, but it is not specifically quantum mechanical.

Observation is a physical process. A meter can only register the position of an object if there is some interaction that makes the object’s position act on the meter. Newton told us, long before Heisenberg, that this means any meter is also going to react upon the thing it measures. Such ‘observer effects’ are apt to be important when large meters measure tiny things; the Heisenberg Principle, however, is not this, but an additional complication in microscopic measurements.

In the early days of quantum mechanics, critics of the new theory tried to argue that the Uncertainty Principle was not self-consistent, by describing hypothetical experiments that would obey the Principle in each individual process, yet still lead to an indirect violation of the Principle as an end result. These arguments all had subtle flaws, and the most famous flaws involved reaction effects. Thus the only really solid connection between the Uncertainty Principle and measurement reaction is historical.

Newton observes Heisenberg stealing credit for his ideas.
Newtonian reaction in physical measurements is a distinct concept from Heisenberg Uncertainty, but it does at least seem that one must get the former right in order to understand the latter. So perhaps there really is some deep connection between them. Until that connection comes to light, however, anyone who wants to relate observer effects in general to a basic principle of physics should really be citing Newton, not Heisenberg.

Monday, October 28, 2013

Opposites: Open and closed

Are they really so opposite?
As Niels Bohr used to say, the opposite of an ordinary truth is a falsehood, but the opposite of a profound truth is another profound truth. I offer this opposite pair:

1) All systems are open.
2) All systems are closed.

A closed system is a set of physical things which can be regarded as isolated from the rest of the universe. An open system, in contrast, is affected by things outside itself, even if those things are not directly observed. So these statements are certainly opposite. How are they both true?

What defines 'the system'?
Experiments try to isolate variables, but we can never achieve perfect isolation. Vacuum chambers are made of steel walls, and over time a few stray gas atoms always percolate in and out of tiny cracks or pores in the steel surface. No laboratory building is perfectly insulated from vibrations. High energy cosmic rays can pierce any barrier; and so on. It may be possible to achieve isolation that is excellent for all practical purposes, but all physical systems are open, strictly speaking. 

If we really want to speak strictly, however, then the very concept of a ‘system’ is inherently an approximation. There is really only one system: the universe. The universe as a whole is closed by definition, so all systems are closed. Of course, it is no less impossible in practice to describe the whole universe than it is to seal off a portion of the universe in perfect isolation from the rest. It is often possible, however, to describe a very large closed system.

And indeed this is precisely what we normally do, to identify the distinctive physical features of an open system: we analyze a large closed system, and then discard all the information that does not refer directly to the small sub-system that represents our ‘open system’.
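In quantum mechanics this discarding step has a standard mathematical form: if the large closed system, sub-system S plus environment E, is in the state ρ_SE, then the open system S is described by the reduced density matrix

$$ \rho_S = \mathrm{Tr}_E\,\rho_{SE}, \qquad \langle A \rangle = \mathrm{Tr}\left(\rho_S A\right) $$

for any observable A that belongs to S alone.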

So any system is open, if we want it to be: it is only a matter of how low we set our threshold for ignoring slight influences from external factors. Conversely, however, any system is closed, if we want it to be: it is only a matter of how large we are willing to make our system, to bring relevant external factors within its frame. The distinction between open and closed systems is an important one, but it is not a distinction between two different ways things can really be. It is a distinction between two different ways of thinking about things. Both ways can be good ways of thinking. Both truths are profound.


An engine would still run inside a large box.
It seems to me that too many physicists today have lost sight of the second truth, however. The most profound mystery that physics still faces is the origin of irreversibility. We don't understand why we can't remember tomorrow. And whatever is going on in quantum measurement, it seems to be an empirical fact that all quantum measurement devices rely crucially on thermodynamically irreversible processes to achieve their extreme amplification. No-one can find a clear explanation of irreversibility within closed-system Hamiltonian mechanics, but few people want to accept that our mighty science is still stumped by such a basic question after a century of breakneck progress, so most people like to think that the open system generalization must be the simple solution.

Open systems can't be the basic explanation of irreversibility, because all systems are also closed. Whether or not a system is open is not a physical fact, but an arbitrary choice of perspective in deciding what to include within the system. So the openness of physical systems cannot make a fundamental difference to anything; anything that can be explained as an open system must also be explicable as a larger closed system. A steam engine would still run, at least for a good long time, inside a big impermeable box.

Tuesday, February 19, 2013

A Cup of Heat, Monsieur?

Pierre-Simon de Laplace and Antoine-Laurent de Lavoisier
thought that heat was an invisible liquid.
Eighteenth-century physicists thought of heat as something much like electricity: an invisible fluid that could flow through other materials or soak into them. They named this hypothetical fluid ‘caloric,’ and it was thought to be a distinct form of material, like air or water, only different. If an object had absorbed a lot of caloric, like a sponge soaking up water, then it was hot. If the caloric drained out, the object cooled.

This was a sensible theory. We all know that objects can become electrically charged and that this changes their properties. A charged balloon can stick to the ceiling, or make your hair stand on end; a charged metal sphere can give you a zap. Electric current really is an invisible fluid, composed (most usually) of material particles called electrons. Most of the time they are bound up in atoms, but when they come loose they can flow into things, or out of them or through them. Things can become charged by soaking up extra electrons.

Soaking up electric charge. 
(Image by Wikimedia user Dtjrh2
used under Creative Commons license.)
In a similar way, it would seem, objects can also soak up heat. Hot objects may expand or cause burns, just as charged objects stick to ceilings or shock people. Heat flows from high to low temperatures, just as electricity flows from high voltage to low. Appealing as it is at this basic level, the analogy turns out to work well even in finer detail. Antoine Lavoisier and Pierre Laplace developed an extensive body of caloric theory that was able to explain heating and cooling and many other thermal phenomena with quantitative accuracy. 

So in the eighteenth century it only made sense to think of heat as something similar to electrical charge. In the following century, heat engines and electric motors would be developed in parallel, and engineers would still think of them in similar terms. Today again people are deciding whether to have a car powered by an electric motor instead of a combustion engine, and the differences seem to be ones of practical detail. 

Today we no longer believe that heat is a material fluid, however. Why not? It's not really as clear-cut an issue as textbooks often make it seem, because today our concept of matter is not as simple as it once was. We know that not even electrons are really these indestructible little specks of hard stuff: they can be created and destroyed in high-energy collisions. And in a lot of ways we still treat heat as if it were a material fluid.

The bottom line, though, is that even though electrons can be born and die, electric charge can't. If electrons appear or disappear they do so together with positrons, so that the net change in charge remains zero. The only way for an object to become charged is for charge to flow into it from elsewhere—or for opposite charge to flow out of the object. There is no way to simply create charge from stuff that is not charge. Heat, in contrast, can flow into or out of things—but that's not the only way to get heat. One can also create heat, without importing it or already having it. It's called friction.
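In modern language the contrast is between a strict conservation law and a mere balance law with a source term. Schematically, charge density and heat density both obey continuity equations, but only the heat equation has a production term σ ≥ 0, fed by dissipative processes like friction:

$$ \frac{\partial \rho_{\mathrm{charge}}}{\partial t} + \nabla\cdot\mathbf{J}_{\mathrm{charge}} = 0, \qquad \frac{\partial q_{\mathrm{heat}}}{\partial t} + \nabla\cdot\mathbf{J}_{\mathrm{heat}} = \sigma. $$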

Rub your hands together. Feel it? That's heat. 

You didn't just create any new material substance from nothing. Big Bangs and particle colliders aren't as easy as that. So heat is not a material fluid. What is it, then?

Whatever heat is, you've just made some. It's right there in your hands.

Thursday, February 14, 2013

Fire Glows

It's not just bright.
Humans discovered fire a long time ago, but for most of that time we only used it for warmth and light and cooking, rather as Bilbo Baggins used his magic ring for years just to avoid unwanted callers. Only in the 18th century did James Watt show up to play Gandalf, and reveal that our curious little trinket was the One Ring to rule them all. Fire has enormous power.

Even after centuries of technological progress since Watt, we still find it very hard to beat combustion as a source of power. Burning a tank of fuel releases enough energy to lift cargo all the way to the Moon, even with the horrible inefficiency of a rocket engine. Combustion provides energy, as one says, to burn. Why is fire such a tremendously greater power source than, say, clockwork springs or a windmill? I’ve never seen a clear answer to this question in any physics text, but I think I have found a succinct one of my own. 

Fire glows.
Light oscillates really fast.
The fact that fire glows demonstrates that fire is releasing energy from motions (of electrons in chemical bonds) with frequencies in the range of visible light. Those are very high frequencies, around 10¹⁴ cycles per second. As Planck taught us, energy is proportional to frequency. So if human energy needs are for motion at up to a few thousand RPM, mere hundreds of cycles per second, combustion lets us tap energy resources on a scale greater by a factor of a million million. Combustion delivers so much energy because molecular frequencies are so high.
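Planck's relation makes the bookkeeping explicit: each quantum carries energy E = hν, so the energy scales of chemical-bond motion and of everyday machinery stand in the ratio

$$ \frac{\nu_{\mathrm{light}}}{\nu_{\mathrm{machine}}} \approx \frac{10^{14}\ \mathrm{Hz}}{10^{2}\ \mathrm{Hz}} = 10^{12}. $$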

This is what an engine somehow does.
It isn’t easy to gear all that power down by a factor of 10¹² so we can use it, though. Electrons whir around in molecules far too fast for our eyes to follow. We can’t just throw a harness over them. Even if we could, they are very light in weight. They bounce off things, rather than dragging them along. To tap them for power, we need some clever way of gently bleeding off their enormous but very rapidly whirring energy, a tiny bit at a time. There's more to it than just installing an awful lot of tiny gears.

Getting fire to do work means transferring power across a huge frequency range. That's what thermodynamics is all about. The reason that thermodynamics doesn’t seem very much like the rest of physics is that energy transfer across a huge frequency range is an extreme case, in which certain otherwise obscure aspects of physics become very important. That makes them important in general, though, because high frequencies can deliver so much power. It's well worth learning how thermodynamics really works.