In a happy example of interdisciplinary synchronicity, both SciTech Daily and Arts and Letters Daily have links to a Scientific American article about tests of Special Relativity. With both artists and techies recommending it, how can I pass it up as blog fodder?
The article describes a couple of different experiments designed to look for hints of exotic physics beyond our current models of the Universe, by looking for places where Einstein's most famous theory breaks down. Not the "E equals m c squared" bit (though that's the graphic they chose to illustrate the piece), but rather the bits about Lorentz invariance. This is the mind-bending stuff about time slowing down and objects changing size when you travel at speeds close to the speed of light-- it's provided the material for countless pop-science books and tv shows, and made many a dazed undergraduate say "whoa" in a Keanu Reeves sort of way.
The experiments in question are fascinating, and each is a technical tour de force in its own right. Happily, some of the experiments also happen to be carried out by friends and colleagues-- in this case, Nathan Lundblad works in the Laser Cooling Group at NASA's Jet Propulsion Laboratory (JPL), with Bill Klipstein and Rob Thompson, who I worked with at NIST, back in the day. (The group also happens to be headed by the brother of a current colleague... Small world...). As one would hope from the involvement with JPL, the ultimate goal of the experiments is to propel some things with jets, specifically to put sensitive atomic clocks into orbit on the International Space Station. There are two main experiments, outlined nicely by NASA to save me a bunch of typing: one involves comparing clocks in space to clocks on the ground (it goes by the acronym PARCS), the other involves putting a laser-cooled rubidium atomic clock in space, to see if it runs at the same rate as a microwave frequency standard which should be orbiting at the same time.
The article also highlights the work of a group at Konstanz in Germany, where they've performed fundamental tests by comparing the frequencies of two atomic clocks over a long period of time, and by repeating the famous Michelson-Morley Experiment (famous, of course, because Morley was a Williams grad...). These experiments are a little less speculative (they don't require blasting the apparatus into space, for one thing), and have already produced results.
One of the most interesting things about these experiments, and the thing that made this article particularly attractive as blog fodder, is that these are all essentially recapitulating classic experiments of physics. The Konstanz group is explicitly repeating (and updating) Michelson-Morley, and NASA's description of the RACE experiment highlights that aspect of it as well. Other experiments mentioned in the article are just re-checking the previously observed relativistic effects on rapidly moving bodies, and re-confirming the fact that gravity affects all masses equally. This sort of thing tends to be somewhat surprising to non-scientists-- after all, if we've done the experiment once, why repeat it?
It's a question I hear a lot when friends and relatives ask about my work, and it's the same sort of question that leads to the faintly surprised tone of the links to the article (and of the article's opening paragraph). Answering it is the main point of the whole thing:
After a century, Einstein's special theory of relativity, which describes the motion of particles moving at close to the speed of light, has held up remarkably well. But as scientists probe the edges of the current knowledge of physics with new tests, they may find effects that require modifications on the venerable theory.
Several current theories, designed to encompass the behavior of black holes, the big bang and the fabric of the universe itself, could lead to violations of special relativity. So far, recent, updated versions of century-old experiments show no signs that Einstein's vision is reaching its limits. Various tests are ongoing, however, and a new generation of ultraprecise, space-based experiments is set to launch in the next few years, offering some chance, however slim, of observing signs of the laws that will eventually supersede relativity.
It should come as no surprise that people continue to test Einstein's theories after a whole century, because people continue to test Newton's Law of Gravitation three centuries after Newton more or less started the whole field of physics. A group at the University of Washington has done an absolutely astounding series of experiments, demonstrating that the basic expression of the force of gravity (proportional to the product of the two masses, and inversely proportional to the square of the distance between them) holds on scales from the cosmological (attraction between distant galaxies), to the nearly microscopic (separations of less than a millimeter).
The point of testing Einstein's theories is the same as the point of testing the theories he overthrew. It's not exceptionally likely that you'll find a deviation from the accepted theory, but if you do, it's big news. And in a very fundamental way, science extends only as far as our best measurements. This isn't another statement of uncertainty-- just a statement of fact. There's always some theorist, whether a respected member of the scientific community, or just some wing-nut cranking out mimeographed copies of his Theory of Everything in a basement in Kansas, who will boldly predict that Newton and Einstein were wrong, and the proof of the Theory of Everything lies just beyond the scientific lands we know. Relativity will break down when you go at just the right speed, or gravity will change its behavior when the masses are a hundredth of a millimeter apart, or Invisible Pink Unicorns will suddenly become visible and cavort about the lab if you do some experiment that you can't quite do just yet. Until and unless you make the measurement, you'll never know for sure.
When I was an undergrad, I did some semi-official tutoring for some friends who were taking the stereotypical "Physics for Poets" class, taught by a wonderful professor, a slightly absent-minded theoretical physicist who lacks only a German accent and a white lab coat to be the very image of the popular idea of a physicist. He introduced the lecture on gravity, they told me, by saying "The great thing about gravity, is that every time you drop something, it falls," and dropping a piece of chalk into his hand. "At least," he went on, "you think that every time you drop something, it's going to fall." And dropped the chalk into his hand a few more times, as if waiting for it to take off and rocket through the ceiling.
It got a big laugh from the students (in a slightly nervous, "my-professor's-a-lunatic" sort of way), but it makes a very valuable point. Until you actually do the measurement, you never really know. That's why, as scientists, we keep repeating old experiments, testing venerable theories, and dropping pieces of chalk. Most of the time (every time so far), you get exactly what you expect-- the results are the same, the theory is upheld, the chalk hits the floor-- but until you do it, you never know. And the one time the chalk flies off through the ceiling will stand the whole world of physics on its head.
Superior Cultures Don't Need No Reading Comprehension
While we're on the subject of education and Instapundit (and his readers), from a later post, we have:
Reader John Kluge offered a different explanation:
I would be very curious to see what the gender breakdown is among whites and Asians in college versus blacks and Hispanics. Just a guess but I bet the ratio is pretty close to fifty fifty among whites and Asians and much more disproportionately female among blacks and Hispanics. Its an important statistic that was left out of the Washington Post article. Is the problem in colleges a problem with men in general or a problem with black and Hispanic men going to lousy schools, living in a lousy culture that doesn't value education and consequently increasingly falling behind the rest of society? Its an important distinction and no one seems to be picking up.
Meanwhile, the third paragraph of the Washington Post story reads:
At colleges and universities across the United States, the proportion of bachelor's degrees awarded to women reached a post- war high this year at an estimated 57 percent. The gender gap is even greater among Hispanics -- only 40 percent of that ethnic group's college graduates are male -- and African Americans, who are now seeing two women earn bachelor's degrees for every man.
OK, that's not the whole picture, but combined with the information that blacks and Hispanics each make up about 8% of those earning bachelor's degrees (guesstimated using populations and graduation rates for the various ethnicities), you can work out fairly quickly that the white-and-Asian percentage isn't far off the overall average. To give a concrete example, 43 of every 100 students graduating Kluge University are male, three of those men are black, and three Hispanic (we're rounding 33% of 8 and 40% of 8 both to 3). Which means that 37 of the male students are white or Asian, and 37/84 (84% of students are white or Asian) is 44%, not too far from 43%...
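The arithmetic is quick enough to sketch in a few lines; here's a back-of-the-envelope check using the same guesstimated figures from above (Kluge University being, of course, entirely hypothetical):

```python
# Back-of-the-envelope check of the Kluge University example,
# using the guesstimated figures from the text.
total_grads = 100
male_grads = 43          # 43% of all graduates are male

# Blacks and Hispanics each make up ~8% of those earning degrees
black_grads = 8
hispanic_grads = 8

# From the Post: 1/3 of black grads and 40% of Hispanic grads are male
black_males = round(black_grads / 3)           # ~2.7, rounds to 3
hispanic_males = round(0.40 * hispanic_grads)  # 3.2, rounds to 3

white_asian_males = male_grads - black_males - hispanic_males   # 37
white_asian_grads = total_grads - black_grads - hispanic_grads  # 84

fraction = white_asian_males / white_asian_grads
print(f"{white_asian_males}/{white_asian_grads} = {fraction:.0%}")  # 37/84 = 44%
```

Not too far from the 43% overall, which is the whole point.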
So no, it's not all the fault of the colored folk and their lousy culture...
(The web is swimming in education statistics, including the Digest of Education Statistics and an American Council on Education report relevant to this issue. If your goal is to fact-check the asses of the major media, you do occasionally need to do some reading and a little math, but that shouldn't be a problem for a member of a superior culture...)
This is a nice demonstration of one of the great problems of the race debate in America. The real problem here is almost certainly economic in origin-- men from poorer backgrounds either don't go to college, or don't graduate-- but everybody insists on framing this in terms of race. Knee-jerk liberals point to the fact that the poor are disproportionately minorities, and blame everything on racism, while reactionary right-wingers point to the poor statistical performance of minorities, and blame everything on the "lousy culture" of blacks and Hispanics. Both ignore the fact that students from minority backgrounds are (gasp!) a minority of all students (and poor minority students are a smaller minority yet), and the larger problem here is systemic and cuts across all races.
Boys Will Be Boys
One of the latest tempests in the teapot of the "blogosphere" is the article in the Washington Post showing that 57% of all bachelor's degrees are awarded to women. Any comments I make on this are likely to be drowned out as people move on to school vouchers, but I'll throw my $0.02 in anyway.
Regarding the Post article, the always-enlightened Instapundit writes:
SEX DISCRIMINATION IN COLLEGE: 57 percent of degrees are going to women. There's a lot of hand-wringing about why, but they miss the obvious: over the past 20 years there has been a concerted effort to make colleges male-unfriendly environments, with attacks on fraternities, with anti-male attitudes in many classes, with intrusive sexual-harassment rules that start with the assumption that men are evil predators, and so forth. Now men don't find college as congenial a place. It's a hostile environment, quite literally.
How come none of the experts quoted in this article has noticed that?
Maybe I'm missing something here, because I just don't see the need to invoke Eeeeevil feminazis.
Let's look at what these rates would mean in the worst-case scenario where the entering classes were 50/50, and all the discrepancy was caused by driving male students out. For good schools (those in the upper bracket of the US News rankings), the graduation rate for entering freshmen is something like 80%. That means that out of 100 students, 50 male and 50 female, 20 don't make it to graduation. For the male-female split to end up at 43-57, that would mean that sixteen of those twenty would have to be male.
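Making that worst-case arithmetic explicit, with the same round numbers:

```python
# Worst-case scenario: 100 entering students split 50/50, an 80%
# overall graduation rate, and graduates split 43/57 male/female.
entering_male = 50
entering_female = 50
graduates = 80  # 80% of 100 entering freshmen

male_grads = round(0.43 * graduates)     # 34 (34.4 before rounding)
female_grads = graduates - male_grads    # 46

male_washouts = entering_male - male_grads        # 16
female_washouts = entering_female - female_grads  # 4

print(male_washouts, female_washouts)  # 16 4 -- a 4:1 ratio
```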
Sounds pretty bad, no? Curse those man-hating MacKinnonites! But is a 4:1 ratio in the male:female washout rates unreasonable? Anecdotal evidence proves nothing, but I'll throw some out anyway. I went to one of those top 50 US News schools, graduating in 1993, and of the people I knew in college, I can think of two women who were asked to take a year off, both for academic reasons. (The percentage of students graduating within six years of entering is one of the US News ranking stats, so colleges tend to suspend students for a year, rather than kicking them out, in hopes that they'll get their act together and manage to come back and graduate in time to keep the statistics rosy). I can think of at least eight men who were asked to take a year off, and I'm probably forgetting a few. 4:1 is probably not that far off, on the discipline side.
Surely, then, they must've been run off through "male-unfriendly" policies, to account for the 4:1 difference in discipline rates? Nope. Two were academic casualties, while the others were run off for a variety of reasons including: setting off fire extinguishers for fun, setting fire to a dorm, breaking into the Faculty Club to steal booze and bowl a few frames (there's a candlepin bowling lane in the basement-- no lie), being the roommates of the guy who was caught breaking and entering and getting busted smoking pot when the cops came by to search the room for more swag, running across a busy street naked in the midst of a major acid freak-out, and tacking threatening racist messages to the door of the Black Student Union in an effort to "spark a discussion of racial issues" (the student in question was an African-American. No, that never made any sense to me, either...).
(Most of these people came back and graduated a year or two late, but then this was a school that prides itself on a 95% graduation rate for its entering classes-- at a school with less riding on the US News rankings, some of them probably would've been tossed for real. The few students who didn't return after their semi-voluntary leaves were all male.)
What's the moral here? You don't need a nefarious anti-male agenda to preferentially drive male students out of college, you just need an anti-idiot agenda. It's entirely possible to get differences in the gender populations simply from the fact that male students are more likely to get up to the sort of shenanigans which result in property damage and/or potential academic sanctions than their female counterparts (not that the women were little angels, or anything-- I knew most of the women's rugby team, fer Chrissakes-- they just did less damage than the men, and got punished at a correspondingly lower rate). You'll note the absence of "date rape" cases, and I'll also point out that fraternities were disbanded in the mid-60's at the school in question, so it's not Dean Wormer oppressing the free-spirit frat boys, either.
That's not to say that there aren't real gender issues in education-- the male/female split in the general undergraduate population is running close to 45/55 these days, and that's not a product of differential discipline at the college level. But it's hard to see how an "anti-male" campus atmosphere could be responsible for that, either, since most high school students don't have a clue about campus atmosphere before they arrive. That means the "anti-male" paranoia needs to be pushed back into the high schools and junior high schools, which gets progressively less credible. The real story is probably a matter of social and economic factors-- as a person who attended a rural public high school in Central New York, I can say that probably 50% of my high-school graduating class never set foot on a college campus, and male students (even fairly bright ones) were far more likely to give college a miss and move directly into the working world (or VoTech programs, or the military, or some other path that doesn't lead to a bachelor's degree) than female students, simply because they needed the money (and, to a lesser extent, from a sort of manly-man pride thing-- they felt more pressure to support themselves immediately). Female students were more likely to be sent to college, because there were fewer reasonable job options for them without a degree of some sort. It's not feminist indoctrination, it's economics.
Silicon Quantum Computing?
If you want to build a quantum computer that will be more than a laboratory curiosity, there are four major requirements you need to fulfill or issues you need to deal with:
- Decoherence: This is a tricky one, and has to do with the fragility of superposition states, and entangled quantum states. As a general rule, the more your quantum system interacts with the environment, the faster these states fall apart (or "decohere"). If they fall apart faster than you can do the calculation you want to do, you're SOL. So you need a system which is only weakly coupled to its environment.
- Entanglement and addressability: On the other hand, you need quantum entanglement to make the computer work, and you need to be able to entangle arbitrary pairs of "bits" together. This requires strong and controllable interactions between the systems which act as your "bits." Yes, this seems to conflict with the previous requirement-- finding systems with slow decoherence but strong entangling interactions is a major problem in quantum computing.
- Readout: Once you've got a computer, and you run it through your calculation, you need to be able to read the answer out without making any mistakes. This may seem trivial and obvious, but finding a way to read the states of single quantum systems with near-100% efficiency is anything but trivial. It also tends to require a means of strongly interacting with the individual bits, which goes back to the decoherence problem again.
- Scalability: If your computer is going to be of any use to anyone, it needs to be able to handle big numbers. The numbers used in cryptography, for example, tend to be products of prime factors with something like a hundred digits each. That's a lot of bits, and still more are needed to do things like error correction (which helps deal with the problem of decoherence as well).
So, in summary, you need a scalable array of a large number of systems with strong and controllable mutual interactions, but weak coupling to the environment, and a read-out scheme with 100% quantum efficiency. This is, in technical terms, a pretty tall order. A large number of candidate systems have been put forward, as I noted yesterday, but most fall short on at least one of these points.
Taking a few of the major candidates, in no particular order:
Trapped Ions: Probably the best developed of the current schemes, based on years of pioneering work by Dave Wineland's Ion Storage Group at the National Institute of Standards and Technology (NIST) (full disclosure: I used to work for NIST, though in Gaithersburg, not Boulder (where Wineland's group is)). Their "bits" are single ions of beryllium held in an electromagnetic trap and laser cooled to extremely low temperatures (two different energy states of the ions serve as the "0" and "1"). This is a beautiful scheme in a number of ways-- the ions can be individually addressed with relative ease, they can be coupled together arbitrarily by using the collective motion of all the ions in the trap as a sort of "data bus," and the readout is easily done by laser fluorescence. They've demonstrated the ability to create arbitrary quantum states with the ions, they've made entangled states and "Cat" states, and have demonstrated simple quantum logic gates (the building blocks for a quantum computer).
The big problems facing this scheme are scalability and decoherence. It's difficult to trap really large numbers of ions, and hard to envision quite how to store and transport enough of them to do real computations. They've also been plagued by a mysterious heating of the ions in the trap, which may cause decoherence problems. Last I heard, they thought they were starting to get a handle on that, though.
Trapped Neutral Atoms: This is one of the more speculative of the possible schemes-- I include it mostly because I used to work with some of the people pursuing this (Ivan Deutsch and Poul Jessen), and what's the point of having a weblog if you can't hype your friends?
The idea here is to use neutral atoms trapped in an optical lattice (a sort of "crystal" structure made by trapping atoms in a set of intersecting laser beams) as the "bits" for the computer. It offers some distinct advantages-- neutral atoms interact less strongly with the environment than ions do, so the decoherence problems should be smaller, and the lattice further isolates them. It's also relatively easy to get a large number of atoms into a lattice, and collisions between atoms in neighboring sites can be used to do the necessary operations.
The major issues here are readout and addressability. It's not clear how to address specific pairs of atoms in the middle of the lattice, nor is it clear how to read the final states out. Still, it's a promising area for basic research into quantum phenomena, even if it's unlikely to provide the mechanism for the next generation of code-cracking.
Liquid State NMR: This is one of the better developed techniques, but also far and away the most controversial. The idea here is to use the nuclei of atoms in complex molecules (one of the early experiments used caffeine, leading to countless "computing in a cup of coffee" jokes) as the "bits", and manipulate them using Nuclear Magnetic Resonance techniques (the same thing that makes an MRI scan work, though the medical profession is too cowardly to leave the word "Nuclear" in there). They use large samples (billions of molecules), effectively doing the same computation in parallel, and use the redundancy to make the readout possible. They can't detect the state of a specific atom in a single molecule, but the sum of the signals from analogous atoms in many identical molecules is big enough to pick up.
The liquid-state NMR experiments burst onto the scene in rather dramatic fashion, and have drawn no end of media attention. It's a promising system in many ways, as the nuclei interact very weakly with the environment, and thus the decoherence rates are very low, while NMR techniques allow fine control of the quantum states of the various atoms, and efficient readout of the final state. They even claim to have used Shor's factoring algorithm to factor the number 15 (the smallest number for which this isn't trivial).
Unfortunately, there are also major issues here, mostly arising from the fact that the experiments are necessarily performed at high temperature (i.e. room temperature). The biggest practical problem is that very few of the molecules start in exactly the right state to begin the computation (you want to start with all the bits set to "0", then put in the numbers you want), and they don't have the ability to put them there. Thus, they're stuck with whatever tiny fraction happens to naturally be in the right state, and that fraction drops off exponentially as they increase the number of "bits." In terms of scalability, it's a dead loss.
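The exponential die-off is easy to see with a toy estimate (my own illustrative numbers, not anything taken from the actual experiments): if each spin independently has probability p of starting in "0", the fraction of molecules with all n spins correctly initialized is p raised to the n, and at room temperature p is only a hair above one-half:

```python
# Toy estimate of the fraction of molecules starting in the all-"0"
# state. At room temperature each spin is only just barely more likely
# to be "0" than "1" -- p is a hair above 1/2 (illustrative number,
# not from any particular experiment).
p = 0.50001

for n in (5, 10, 20, 50):
    fraction = p ** n  # probability that all n spins start in "0"
    print(f"{n:2d} bits: usable fraction ~ {fraction:.2e}")
```

By fifty bits the usable fraction is down below one part in a thousand trillion, which is why scalability is a dead loss.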
Worse yet, there's an argument over whether the NMR computation even does what they say it does, due to the fact that they're not working with pure states in single systems, but rather with huge ensembles of molecules of which only a few actually count (euphemistically called "pseudo-pure" states). Some people have published models showing that it's possible to simulate the NMR computations classically, which would seem to indicate that it's not really quantum computation after all. This has led to some of the nastiest fights I've ever seen among scientists, with wild accusations on all sides. Last I heard, this hadn't been resolved yet, but whether it's quantum or not, the scalability issues would seem to rule this out as a means to practical quantum computing.
Silicon: Silicon chip fabrication is, as everyone knows, the basis of our current computing technology, and we're really very good at making intricate structures of silicon. One of the most interesting proposals in the field is to make quantum computers in silicon. This is particularly attractive since it would allow quantum computing to make use of the tremendous infrastructure (both in terms of mass manufacturing, and also technical expertise) built up around conventional computing.
It's a very nice proposal, with the nuclei of single phosphorus atoms embedded in a silicon matrix serving as the "bits". They're nicely stable, decoherence times are reasonable, and a large array of them could be built onto a chip without too much trouble. As for interactions and entanglement, various electronic gates could be built onto the same chip, giving individual control and addressability for each phosphorus atom.
The killer problem here is the readout. It's just not practical to detect the state of a single phosphorus nucleus with anywhere near the necessary efficiency, using current technology. You can do the computation just fine, but that doesn't help if you can't read the answer.
So that's the bind we're in, at present, when it comes to quantum computing. There are nice ideas, and some promising schemes, but the best schemes aren't scalable, and the scalable schemes can't be read out. It's hardly a dead end (and, as Dave Wineland likes to point out, even if it never produces a working computer, research in these areas adds immeasurably to our knowledge of quantum physics), but there are major technical hurdles in the way of progress with any of these schemes.
Which brings us to the new paper that started this whole long essay. It's a proposal for a new method, which seeks to combine the best features of the liquid-state NMR and silicon chip proposals. Better yet, the authors claim that everything they propose can be accomplished with current technology.
The idea is this: take a chunk of silicon made up of two isotopes of silicon (with atomic weights of 28 and 30 mass units, which have no nuclear spin and thus produce no NMR signal), and cut a stair-step pattern into it. Then add a third isotope, silicon 29, which will give a large NMR signal. The silicon 29 atoms will arrange themselves into long chains (tens of thousands of atoms each), running along the edges of the "steps". You can then lay down more silicon 28 and 30 to lock the chains in place.
These long chains of silicon 29 become the registers for your quantum computer. Each atom is a "bit", and atoms in the same chain can be entangled using NMR techniques. By building a small magnet onto the same chip, you can address each individual atom in the chain separately, and you can arrange it so that all the chains are identical, which solves the readout problem-- you can't read the state of a single atom in a single chain, but you can read the average state of ten thousand atoms in the same position in ten thousand identical chains. You do the same computation in parallel on all the chains, and read the whole ensemble at the end, just like the liquid-state NMR system.
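The ensemble-readout trick can be sketched with a toy signal-to-noise estimate (the numbers here are illustrative, not from the paper): one atom's signal is buried in the detection noise, but identical signals from N chains add coherently while random noise only grows as the square root of N:

```python
import math

# Toy readout model with illustrative numbers (not from the paper):
# each atom contributes a signal s, against detection noise sigma >> s.
s, sigma = 0.05, 1.0
n_chains = 10_000

# A single atom is hopeless -- its signal is far below the noise.
snr_single = s / sigma

# N identical chains: the signals add as N, the noise only as sqrt(N),
# so the signal-to-noise ratio improves by a factor of sqrt(N).
snr_ensemble = (n_chains * s) / (math.sqrt(n_chains) * sigma)

print(snr_single, snr_ensemble)  # SNR improves from 0.05 to about 5
```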
The scalability problem which afflicts liquid-state NMR computation is dramatically lessened in the solid state (though not solved completely). Because the chains are built into a solid matrix of silicon, you can do the whole experiment at much lower temperatures (around 4 Kelvin, the temperature of liquid helium), and there are tricks you can play to put the atoms into the proper initial state, so you can use a larger fraction of the sample, and get away from the exponential decrease in signal that cripples liquid-state NMR.
This looks like a tremendously promising method, subject, of course, to the caveat that this really isn't my field, and I can't speak to the feasibility of the various manufacturing techniques the authors propose. They say that the necessary manufacturing processes are all well understood, and provide extensive citations, but I'm not a silicon person, and can't tell if they're being overly optimistic (it's published in Physical Review Letters, one of the most respected journals in physics, so they're at least not cranks). If their claims hold up, this could open the possibility of working quantum computation in the much nearer future than anyone expected. Whether it'll be enough to really solve practical problems (their graphs suggest a computer of a hundred or so bits would be practical, but that's not enough to make the NSA swoop in and classify the whole thing) remains to be seen, but it might be the first step toward a new revolution.
The biggest flaw I see in the whole thing is that the technique still depends on the "pseudo-pure" states which caused such a catfight in the liquid-state NMR systems. The polarizations they claim to be able to obtain here are much better than you can do in liquid-state NMR, but it's still an impure ensemble of states, and may be subject to the same objections raised with the earlier work. We'll have to wait and see how the whole thing shakes out, but whether this turns out to be a dramatic new development or just another over-hyped and impractical scheme, it's an exciting time to be in physics.
Alive, Dead, or Bloody Furious
Idly surfing past SciTech Daily the other day, I saw a link to a puff piece in Nature about quantum computing. "This would be good weblog material," I thought, and for good measure downloaded the actual article on which the piece was based (I can do that, being at a college site. If you're reading this from someplace other than a research institution, you probably can't get the full text. Not that it matters...). I'll discuss the details in a separate post-- first, a bit of blather explaining what "quantum computing" is and why people care.
By way of a general-audience introduction to one of my research talks, I once tried to make a ranked list of the Weirdest Ideas in Quantum Mechanics (most of which turn out to be involved in the work I did at Yale). It's hard to get a real solid order of them, but however you do it, the ideas of superposition and entanglement have to be right up there near the top. Superposition is the idea behind the famous "Schroedinger's Cat" thought experiment (You can also find it described in verse and try the experiment. The Web is a silly place.), and also what makes a quantum computer work.
The idea of superposition is this: as a matter of mathematics, if you find two valid solutions for a quantum problem ("A" and "B"), the sum of those two solutions ("A+B") is also a solution. Stated formally, it sounds reasonable enough, until you apply it to concrete physical situations: For example, I've got a "Still Not King" coffee mug sitting on my desk. Call Solution A the situation where the mug is to the right of my keyboard, and Solution B the situation where the mug is to the left of my keyboard. Quantum mechanically, the most general possible solution is the case "A + B", where the mug is both to the left of the keyboard and to the right. And, better yet, it stays in that state until I look at the desk to see where it is (effectively making a measurement of its position), and force it to be either on the right or the left. (It's to the right, if you care...). While we only ever detect objects in one particular quantum state or another, between measurements, they exist in a fuzzy, undefined condition in which they can occupy all possible states simultaneously.
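For the mathematically inclined, the mug example can be sketched with a two-component state vector. This is strictly a toy model (the "left"/"right" labels are my own, and nothing about real mugs is rigorous here), but it shows the structure:

```python
import numpy as np

# Toy basis states: "mug on the left" and "mug on the right"
left = np.array([1.0, 0.0])
right = np.array([0.0, 1.0])

# The superposition "A + B", normalized so the probabilities sum to 1
mug = (left + right) / np.sqrt(2)

# Measurement probabilities: the squared overlap with each basis state
p_left = abs(np.dot(left, mug)) ** 2
p_right = abs(np.dot(right, mug)) ** 2
print(p_left, p_right)  # each is 1/2, up to floating-point rounding
```

Until you look, the state is genuinely both; looking forces one outcome or the other, with 50/50 odds.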
This is at least as counter-intuitive as the whole idea of uncertainty, but like uncertainty, it's absolutely and verifiably true. Well, it hasn't been checked for coffee mugs, but "Cat States" of simple quantum systems have been generated using everything from small clouds of cold atoms to flowing currents of liquid helium. And we believe that it would work for coffee mugs as well, were it possible to actually do the experiment.
(Why doesn't this work with coffee mugs, you ask? The answer is complicated, and numerous books have been written on the subject, so it rates at least a post of its own, rather than just a quick aside in the middle of a longer essay. The short version is just to say that these states are very fragile. The more particles involved in the quantum system, the more chances the system has to interact with the outside world in a way which destroys the superposition state. For small systems which can be isolated from their environment, the superposition can be maintained for a significant amount of time, but for something like a coffee mug, containing millions of billions of atoms, the superposition falls apart essentially instantaneously. This will turn out to be critically important for quantum computing, and I may say more about it there...)
A fairly closely related idea is the idea of "entanglement." This describes a case where not only is the state of a particle not completely determined, but it's also bound up with the state of another particle (or many other particles). Entanglement arises when the states of particle X and particle Y are related in some definite way-- say, they're both in the same state. Each individual particle is in a superposition of states "A" and "B", and on top of that the states of the two are correlated-- if particle X is found in state A, particle Y will be in the same state, and likewise, if one is in B, the other is also in B. The two normally independent systems have become entangled with one another, and behave as a single quantum entity.
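(A quick numerical sketch of that correlation, again in Python with my own labeling conventions: the joint state puts amplitude only on the matching outcomes, so every simulated measurement of the pair comes out "A A" or "B B", never mixed:)

```python
import numpy as np

# A maximally entangled state of particles X and Y: (|AA> + |BB>) / sqrt(2).
# Rows index X's state, columns index Y's; "A" is 0, "B" is 1.
amplitudes = np.array([[1.0, 0.0],
                       [0.0, 1.0]]) / np.sqrt(2)

# Joint probability of each pair of outcomes: only (A,A) and (B,B) survive.
probs = amplitudes**2

rng = np.random.default_rng()
for _ in range(5):
    idx = rng.choice(4, p=probs.ravel())
    x, y = divmod(idx, 2)
    print("AB"[x], "AB"[y])  # the two letters always match
```

Note that neither particle has a definite state on its own-- each is a 50/50 proposition-- but the two answers are locked together.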
Entanglement is one of the most difficult ideas in quantum mechanics, because it doesn't matter where the two systems are relative to one another. If I entangle two quantum systems here, and send one on a rocket to the Andromeda Galaxy, it's still entangled with the second system sitting on my desk, and a measurement made in my office in Schenectady will absolutely and instantaneously determine the outcome of a measurement made millions of light-years away. This bothered Einstein so much that he spent much of the later part of his life trying to disprove or supplant quantum mechanics. That's a story for another post, but entanglement does have a role to play in quantum computing, so I have to at least mention it here.
What does all this have to do with computers? It's not just that your computer exists in a superposition of both "running happily" and "Blue Screen of Death" until you look at it (that's not quantum, that's just Windows), but rather that you can use this idea of superposition to build a different kind of computer. Instead of classical "bits" which take on the value of either "0" or "1", you use quantum bits, which can be both "0" and "1" at the same time. (These quantum bits are dubbed "qubits," pronounced "cubits" in one of the fits of cutesy nomenclature that plague my profession...)
The standard cocktail-party explanation of the advantage of a quantum computer is that this superposition business lets you perform operations not just on a single, definite number, but on all possible numbers at once. This means that complicated problems become relatively simple, and you can use a quantum computer to quickly solve problems which would take forever (defined here as "longer than the age of the Universe") to solve on a classical computer.
That explanation significantly over-simplifies the process-- finding algorithms which use the quantum character of a computer to speed up calculations is not a trivial problem, and the resulting algorithms require you to entangle the quantum states of various bits in the computer, and thus are a little more complicated than just doing the same operations you would with a classical computer, but in a massively parallel manner (here's a detailed formal explanation of the two "killer apps" in quantum computing). Nevertheless, the end result is the same: There are problems you can solve quickly with a quantum computer that you can't solve with a classical computer in any reasonable amount of time.
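(To make the "all possible numbers at once" business a bit more concrete, here's a small sketch-- a toy example of my own, not any particular quantum algorithm. An n-qubit register is described by 2^n amplitudes, and putting each qubit into a superposition loads all 2^n numbers into the register in one step:)

```python
import numpy as np

n = 3  # a three-qubit register
# Start in the state |000>: all the amplitude on the number 0.
state = np.zeros(2**n)
state[0] = 1.0

# The Hadamard gate puts a single qubit into an equal mix of "0" and "1".
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Applying it to every qubit at once is the Kronecker product of n copies.
op = H
for _ in range(n - 1):
    op = np.kron(op, H)
state = op @ state

# All eight numbers 0 through 7 now have equal amplitude, 1/sqrt(8):
print(np.round(state, 3))
```

The catch, as noted above, is that a measurement only ever hands you one of those eight numbers at random-- the cleverness of a real quantum algorithm is in arranging interference so that the answer you want is the one you're likely to get.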
In particular, it turns out that a quantum computer, if it could be built, could potentially factor very large numbers. Since many modern encryption schemes are based on the idea that factoring very large numbers is an incredibly time-consuming process, this caused a bit of a stir in the intelligence community, and brought the National Security Agency into the business of fundamental quantum research.
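(To see why factoring matters, consider the dumbest classical approach, trial division-- a deliberately naive sketch of my own; real code-breakers use far cleverer methods, but every known classical method still blows up exponentially with the number of digits:)

```python
def trial_division(n):
    """Factor n by brute force. The loop can run up to sqrt(n) times,
    which is exponential in the number of digits of n -- hopeless for
    the hundreds-of-digits numbers used in modern encryption."""
    d = 2
    steps = 0
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, n // d, steps
        d += 1
    return n, 1, steps  # n itself is prime

# A toy "RSA modulus" built from two small primes:
p, q, steps = trial_division(101 * 103)
print(p, q, steps)  # 101 103 100
```

One hundred steps for a five-digit number is nothing, but the step count roughly multiplies by three for every extra digit-- which is exactly the wall a quantum factoring algorithm would knock down.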
The NSA is one of those ultra-secret agencies whose mere existence was classified for many years (an old joke is that the letters stand for "No Such Agency"). Now, of course, they have a Web page, which succinctly describes their mission as "Providing and protecting vital information through cryptology." These are the Code Guys for the US government, and their mission is to ensure the security of American codes, and crack the codes of our enemies. If a quantum computer can be built, they want it, and they're willing to pay big bucks to get it.
Their entry into the research-funding business caused a veritable explosion of interest in quantum computing. Since it was widely assumed that anyone drawing NSA funding would be working in the Infinite Money Limit, everybody and their extended families sought to find a way to tie their particular area of research expertise into quantum computing. There are literally dozens of proposals for different ways to build a quantum computer-- using trapped ions, trapped atoms (in a host of different schemes), cavity quantum electrodynamics, nuclear magnetic resonance on molecules suspended in liquid, "quantum dots", single phosphorus atoms embedded in silicon chips, and a host of other systems I can't think of at the moment.
Everybody has a scheme for how to make a quantum computer, given money, time, money, equipment, money, and eye-popping advances in technology. And some money. Everybody also has a list of reasons why all the other schemes won't possibly work, and there've been some ugly fights in the field. To say that most claims in the field of quantum computing need to be taken with a grain of salt is an understatement-- there isn't enough salt in all the seas to supply the metaphor for the necessary degree of skepticism. It's been an entertaining time to be in physics.
The new paper linked to above claims to present a scheme for building a working quantum computer using present technology. The authors claim to have found a scheme which combines a number of the best features of other methods, and works around the important technical issues. If it works, this is Big News. It's also a big "If".
Unfortunately, this post has gotten a bit long, and I do have a day job, so the details of what the issues are, and how this new scheme proposes to deal with them, will have to wait for another post.
Strange Women Lying in Ponds Distributing Swords
OK, one more comment on the Den Beste thing (yeah, yeah, yeah, it's taken all of three days to break my stated rule against multi-posting. Bite me, this'll be short). At the end of the post about the BBC, we have:
At least, that's what I hope it says. The fear is that Powell and the State Department won't give up and will continue to fight to weaken this position. I hope not; I hope that Powell understands that Bush has now made a real decision and it is Powell's job to carry out this policy and not to subvert it. It's virtually certain that Powell will be embarking on a whirlwind tour of a number of important capitols, and I hope he realizes that it's his job to say to each head of state he visits that President Bush really does mean what he says about this.
Is it just me, or do most of the warblog crowd make Bush sound like some sort of ineffectual feudal overlord out of a bad fantasy novel? While the King sits in his White Palace and issues proclamations, the scheming Duke of State runs around pursuing his own nefarious plans, and tries to sway the King from the path of righteousness. Meanwhile, the noble Baron of Defense offers steadfast service and bold strategy, and the kind Enchantress of National Security offers wise counsel.
You can almost see Ian McKellen playing Donald Rumsfeld in the inevitable movie. (Well, OK, maybe not.)
Still, is this any way to run a country? Maybe American-style government isn't what the Palestinians need after all...
Let Slip the Blogs of War
The bloggerati are all a-buzz this morning over Bush's latest Big Speech, this time regarding the Middle East. Steven Den Beste crows:
As long as the Palestinians prosecute the war against Israel, and as long as they refuse to clean up their act and become a civilized people with an honorable government, which isn't riddled with corruption and arbitrary dictatorial rule, then the Palestinian people will continue to suffer and the US will not do anything to relieve that suffering.
the famed Instapundit proclaims:
NOT SO WOBBLY NOW: Bush says become a Western-style democracy and then we'll talk peace. And no more Arafat -- we're tired of that terrorism shit, dudes, and we're not fooled.
This isn't just an Israel / PLO thing, it's a signal to some other people. (Look at the passage following the reference to "people of Muslim countries" in the text).
and Andrew Sullivan writes:
And the key message is that Israel must have a viable partner, a democratic partner, if peace is to be secured. It cannot be secured while today's psychotic Palestinian culture and chaotic polity remains in place. And the 1967 borders - give or take a little - are the obvious future contours of the Jewish state. The president was right to appeal to innocent Palestinians over the heads of their corrupt leaders; and he is right to stress hope. I'm sorry to say, however, that seasoned hands will see precious little reason for any.
All of this, I basically agree with. Does the Palestinian Authority need reform? Absolutely. Should they become a Western-style democracy? Absolutely. Everybody should become a Western-style democracy (except maybe Cuba, just because it's so much fun to see the occasional impotent, spittle-flecked rant about how Eeeeevil they are). Is stable, non-autocratic, non-corrupt government in the Palestinian territories a necessary prerequisite for peace, and even meaningful negotiations for peace? Absolutely. The best hope for the region and the world would be to establish a Palestinian state with a stable, trustworthy, and reasonably free government.
Without continued and deep American involvement in the region, many critics are likely to argue that the president has laid out a vision for peace that he has little faith in ever seeing fulfilled.
In the short-term the Bush administration will now have to sell the plan to the Palestinians and their Arab supporters.
It won't be easy and is in many ways a task filled with irony.
Walsh evidently didn't understand: this wasn't a peace plan. It was a policy statement.
In a sense, this was a peace anti-plan. It was a statement by the United States that our government was no longer going to try to jump through hoops to try to appease the Palestinians and stop their violence, or to restrain the Israeli retaliation. The United States is no longer going to try to deal with a blatant liar like Arafat at all, because he can't be trusted. The United States isn't going to force Israel to make major concessions. The United States is emphatically not going to take an Arab point of view.
I really, truly, hope he's wrong. Because I think the BBC writer grasped the real point far better than Den Beste and Reynolds (and possibly Sullivan, though he does at least strike a note of caution).
The best (and probably only) hope for the region does involve reforming the Palestinian Authority, replacing Arafat (or kicking him upstairs to a more symbolic post, which the Bush speech doesn't actually rule out), and at least steps toward making a Western-style state out of the current mess. These are not, however, easy things to accomplish, or things that will take place overnight.
This isn't Star Wars where simply knocking off the head of the Evil Empire is enough to make computer-generated peasants and midgets in furry suits dance with joy at the ushering in of a new era of peace and liberty. If Yasser Arafat were to disappear off the face of the Earth tomorrow (in some hypothetical manner which couldn't be blamed on the Mossad), it wouldn't solve the problem. For one thing, it's not even clear who would replace him, let alone whether they'd represent an improvement in terms of appalling corruption and autocratic rule.
Building a stable government in the Palestinian territories will require an organization which is reasonably popular, not (in)famously corrupt, and able to provide basic services to the citizens (which the existing Authority isn't particularly good at). Two possible candidates spring to mind: Hamas and Hezbollah. Whoops. They're pretty much disqualified on the grounds that they're not likely to set up a Western-style democracy, and they have a small credibility problem when it comes to negotiating with the US, let alone Israel... (They also don't have the necessary breadth of support, despite what you may think.)
It's not immediately clear that there is any reasonably moderate organization in the Palestinian territories which could serve as a platform for setting up a government we'd be willing to deal with. Even if there is (there was a talk on campus by former Ambassador Dennis Ross, who seemed to think there were some candidates on the municipal level), a Western-style democracy is something that will need to be built from the ground up, and that's unlikely to happen without significant involvement and support from the United States and Europe. You won't magically get a trustworthy Western-style state by simply changing leaders-- it's a slow process of building the various rules and institutions, and faith in those rules and institutions, that you need. There are countries out there that've been at this for a long time, and still don't quite have the hang of things (Japan comes to mind)-- a democratic Palestine isn't going to emerge quickly without a lot of support, financial and otherwise. We don't have to give the money to Arafat, but there's going to have to be money given to somebody to get the process moving, and keep it moving.
If we're really in this business for the sake of peace, and the general betterment of mankind, the BBC piece is exactly right. Western democracy isn't going to spring up overnight, and nothing will improve if we simply stand back and wait for it to crystallize out of thin air. There's a better chance of God Himself intervening to solve the problem than there is of a "viable partner" for Israel emerging spontaneously from what's there now.
If, on the other hand, this statement is intended merely as political cover for washing our national hands of the whole business, and moving on to raining fire and death on other countries which have pissed us off (which is what I suspect Den Beste (whose personal hatred for Arafat is truly impressive) and Reynolds are after), then what we can expect to see is just more of the same, with Israel whipping up on the Palestinians, punctuated by the occasional suicide bombing. Both sides will continue to bleed slowly, and things will get a whole lot worse before they ever get better.
Feh. Just writing about this stuff makes me want to go take a shower. Tomorrow, back to physics, and some comments on quantum computing.
Move Out of Your Parents' Basements
A week or so ago, when I was talking about setting up this web log but hadn't gotten around to it, Kate sent me a link to the Insultingly Stupid Movie Physics site, which she'd seen on MetaFilter. Thinking that this would be good blog fodder, I took a look, and had a Shatner Moment.
I don't mean that I suddenly started putting big... pauses... in... my speech, while runningotherwords... together-- instead, that's a reference to the SNL skit where William Shatner blows up at a Star Trek convention, thundering "Have you ever kissed a girl?!?" at Jon Lovitz (who's wearing an "I Grok Spock" T-shirt and pointy ears). Don't get me wrong-- the movie physics complaints are all perfectly valid, physically speaking (I've even considered using the "Blown backwards through a window" business as an exam problem). They also largely miss the point, in a way which suggests the site was put together by the sort of humorless dork I try very hard not to be. (Says the man with a "Don't Drink and Derive" drink cup sitting on his desk...)
Most of the specific complaints are constructed with the Jerry Bruckheimer class of action movies in mind, and many of them are things that I've remarked on myself (the fact that movie cars seem to have their frames packed with dynamite before leaving the factory, for example, and who hasn't commented on the way that movie guns never run out of bullets?). But it's important to keep some sense of perspective about these things, particularly when it comes to the specific genres in which the movies are made.
Complaining about the exploding cars in a typical action movie makes about as much sense as complaining (as the history book I was reading last night does) that Henry V didn't actually speak English, and thus would've been hard pressed to deliver the "Once more unto the breach, dear friends..." speech, let alone the "St. Crispin's Day" speech. Yeah, fine, strictly speaking, Henry was Henri, and spoke French, but realism isn't the point-- the point is that those two speeches are some of the most rousing speeches ever written in English, and the historical scene is just a stage to let the actors and audience glory in the beauty of Shakespeare's language. (And to provide a little royal propaganda, as the book went on to note.)
OK, I'll admit that, say, Hard Boiled isn't exactly Shakespeare, but the truth is that action movies are at least as stylized as Shakespearean drama, and probably closer to Kabuki theater. Realism isn't the point-- the point is to present a sort of visual symphony of creative mayhem. If that requires gratuitous slow motion, exploding cars, sparking bullets, and Chow Yun-Fat diving across a room with a pistol in each hand firing a hail of bullets through plate glass windows for no good reason, then so be it. (Apparently, it does require all those things, plus the opposition of the most pointlessly destructive criminal organization outside a Warner Brothers cartoon...)
Which isn't to say that there's no limit to idiocy in the name of spectacle-- Armageddon was a dreadful movie, and they hit most of the reasons why (leaving out only the nauseating Animal Cracker Foreplay, which isn't physics, and thus not within their purview)-- nor that there's no place for citing unreality in reviewing a movie-- Red Mike's Reviews use this sort of thing to devastatingly humorous effect. But as Teresa Nielsen Hayden notes in The Evil Overlord Devises a Plot, just because something's a cliche doesn't mean it can't be effective, and I'd extend that statement to cover stupid movie physics as well. Just because the physics is stupid doesn't mean you can't get $8 worth of entertainment out of Things which Go Fast and then Blow Up.
I also don't want to give the impression that the site is nothing but obsessive dork-itude. While the front page was real Shatner Moment material, the specific reviews are significantly better, and even manage a little self-deprecating humor (in the somewhat stiff and didactic vein favored by professors everywhere). And even though they have a few gripes with the physics, they do allow that the safecracking scene in The Score is pretty damn cool, so they're not completely hopeless.
Strained Analogies R Us
Having complained about strained analogies involving the idea of uncertainty (in my previous post), I should note that it's tempting nevertheless to make a strained analogy between politics and physics, and posit some sort of conjugate relationship between liberty and security (probably backed up with a Ben Franklin quote, and some pithy remarks about John Ashcroft). Of course, uncertainty doesn't work that way-- as noted below, it says nothing about the absolute values of quantities. Increasing liberty doesn't necessarily decrease security, nor can you guarantee an increase in security by reducing liberty.
The most you could say with this sort of thing would be that chipping away at civil liberties (reducing both the amount of freedom and the associated uncertainty) through ill-advised legislation serves only to increase the uncertainty in security. Maybe it makes us safer by preventing terrorist attacks, maybe it makes us less safe by encouraging gross abuses by the government. There's no way to say for sure.
Actually, there may be something to that after all...
What's With the Name?
The obvious question to ask (and thus for me to answer) at this point would be "So, what's with the title?"
There are a number of answers to that (I'm easily amused by minimally clever double and triple meanings). For one thing, it's in the tradition of self-deprecating weblog names (the most obvious antecedent being Jim Henley's Unqualified Offerings). Another answer would be that it's a reference to the mutability of the high moral principles used in politics.
The main answer, though, is that it's a physics thing. It's a reference to the famous Heisenberg Uncertainty Principle. A few of the better-known mathematical expressions of the Uncertainty Principle are shown over on the left. (Which also answers the next obvious question, "What's with the Greek letters and stuff in the links menu?")
Uncertainty (the German word Heisenberg used is actually closer to "indeterminacy", which would be a better name for the concept, but "uncertainty" caught on, and we're stuck with it) is one of the pillars of quantum mechanics. In its most famous form, it states that the product of the uncertainty in the position of a particle (Δx) and the uncertainty in the momentum of the particle (Δp) must be a number greater than some minimum possible value. That is, it's impossible to know both the position of a particle and its momentum with arbitrary precision. The better you know where something is, the less you know about how fast it's moving, and vice versa. Similar relations exist for many pairs of "conjugate variables"-- the other two forms cited on the left relate the energy of a quantum state and the time you have to look at it, and the number of particles in a quantized field and the quantum-mechanical phase associated with that field.
This is one of the hardest ideas in physics to get your head around. It more or less destroys the idea that it could be possible to know absolutely everything there is to know about a given physical system, and places fundamental limits on the kinds of things you're allowed to know-- this pretty much overturns the prior philosophical basis of physics. Worse yet, it's completely counter to our everyday experience. As I note when I give research talks about this stuff, in everyday life, you really don't notice this. When I'm in my car on the highway, I'm pretty certain about my position, and a cop with a radar gun will be happy to tell me exactly how fast I was going. So where's the uncertainty?
The answer is that the minimum value is an awfully small number-- on the order of h-bar, or Planck's Constant divided by two pi: 1.055 times 10^-34 joule-seconds (or, writing the whole thing out, 0.0000000000000000000000000000000001055). You're just never going to notice that in dealing with macroscopic objects. But when you get down to the atomic scale, where quantum effects really become apparent, quantum uncertainty is an important effect. It turns up all over the place in atomic physics (the field I work in). The uncertainty relationship between position and momentum determines the observed size of Bose-Einstein Condensates. The uncertainty relation between energy and time gives you the lifetimes and energy widths of atomic states. And in the shameless self-promotion department, the number-phase uncertainty relation got me a paper in Science (I won't link directly to it, but there's a good article about it in Physics World-- scroll down to the stuff about "squeezed states").
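(A quick back-of-the-envelope sketch, with masses and position uncertainties that are my own illustrative picks: rearranging Δx·Δp ≥ ħ/2 with p = mv gives a minimum velocity spread, which is laughably small for a car and quite noticeable for a single cold atom:)

```python
hbar = 1.054571817e-34  # Planck's constant over 2 pi, in joule-seconds

def min_velocity_spread(mass_kg, delta_x_m):
    """Minimum velocity uncertainty from delta_x * (m * delta_v) >= hbar / 2."""
    return hbar / (2 * mass_kg * delta_x_m)

# My car on the highway, localized to within a millimeter:
print(min_velocity_spread(1500.0, 1e-3))    # ~3.5e-35 m/s: never noticeable

# A single rubidium atom, pinned down to a nanometer:
print(min_velocity_spread(1.44e-25, 1e-9))  # ~0.37 m/s: a big deal for cold atoms
```

A third of a meter per second is enormous by the standards of laser cooling, which is why uncertainty is bread-and-butter stuff in my corner of physics and invisible everywhere else.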
Uncertainty is also one of the most abused and misunderstood ideas in physics. A common mis-statement of the principle is that the act of measuring a quantity destroys your ability to measure other quantities. While that's the classic thought experiment used to try to understand uncertainty, uncertainty is more fundamental than that. It's a fundamental consequence of living in a quantum world, and a direct result of the fact that everything is made up of particles with wave-like properties. If you ask what it means to define a position for a particle which acts like a wave, or the momentum of a bundle of waves acting like a particle, you find that uncertainty is inescapable.
Uncertainty is also frequently mangled in science fiction (which anyone who's read my book log would know is an interest of mine), usually as a means of obtaining really fast space travel. "We'll define the position really well," says the would-be author, "which means the momentum becomes uncertain, and then it's going really fast." This runs into two problems: first, the fact that Planck's constant is a really small number, and you're just not going to get much of a velocity spread for your multi-ton space-ship; and second, the fact that uncertainty just doesn't work that way. The uncertainty principle says nothing about the absolute value of position or velocity, but only about the uncertainty. You don't get to pick your velocity after a position measurement, it's chosen randomly, and as likely to be zero as anything else.
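(For fun, here's the arithmetic on the sci-fi version-- the ship's mass and position uncertainty are made-up numbers of mine. Even pinning a modest spaceship down to the size of a proton buys you a velocity spread far too small to measure, and a randomly directed one at that:)

```python
hbar = 1.054571817e-34  # joule-seconds

mass = 1.0e4       # kg: a small, hypothetical ten-ton spaceship
delta_x = 1.0e-15  # m: position "defined" to roughly the size of a proton

delta_v = hbar / (2 * mass * delta_x)
print(delta_v)  # ~5.3e-24 m/s -- not much of a stardrive
```

And remember, that's a spread, not a speed you get to choose: the ship is as likely to end up sitting still as drifting anywhere in particular.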
The worst abuses usually come in the form of analogies, particularly in fields far removed from physics. I've thrown more than a few articles away in disgust after reading attempts to use quantum mechanics to argue in favor of post-modernism, or post-structuralism, or post-whateverism.
Having griped about people mis-using the concept of the Uncertainty Principle, I should at least take a minute to recognize someone who mostly got it right. Which also provides an opportunity for shameless self-promotion, because I can link to my book log post about Michael Frayn's play Copenhagen. Frayn takes the idea of the Uncertainty Principle, and the mysterious meeting between Werner Heisenberg and Niels Bohr in 1941, and spins them into a surprisingly moving (for a play consisting of three people talking about physics and history) play incorporating a number of fascinating details about the history of quantum mechanics. I definitely recommend seeing it.