
Uncertain Principles

Physics, Politics, Pop Culture

Saturday, August 03, 2002

Communists in the State Department, Arabs on the Donor List

Right up front, I'd like to apologize to those who come here looking for funny physics stuff. The following will be a bit of a rant, so if you're put off by bile, well, go hang out in MC Hawking's Crib, and give the rest of this a miss.

Yesterday, I stumbled across what may well be the most disgusting thing I've read since September 11th last year. In a world containing both Ted Rall and Ann Coulter, that's really saying something. Just linking to this filth makes me want to take a shower.

It seems that Scott Koenig over at "The Indepundit" decided to look into the fund-raising records of Rep. Cynthia McKinney, the outspoken Democrat from Georgia who has drawn the ire of the warblogosphere, basically for failing to get behind the current program of raining fiery death upon Middle Easterners who have displeased us. (To be fair, she's said some astoundingly stupid things in the process, and I am no huge McKinney fan). Mr. Koenig's big finding? She reported $13,850 in campaign contributions on 9/11/01, from people with Arab-sounding names. He ends his post with a smug "Probably a coincidence," but the implication is clear: McKinney's taking money from Arabs, and therefore must be entirely in Al Qaeda's pocket, paid to spout anti-American gibberish. The people in his comments thread certainly jumped all over it.

Let's leave aside for the moment the mind-boggling idiocy involved in thinking that Al Qaeda might judge it a good investment to buy the allegiance of a single member of the House of Representatives, let alone a member of the minority party, let alone a member of the minority party who was already known as a bit of a wing nut. Their $14,000 would've been better spent buying the allegiance of a Yemeni busboy at the Hawk and Dove, in hopes that he might be able to slip Tom DeLay a mickey, but let's leave that aside.

Let's also leave aside the fact that the donations listed on OpenSecrets' web site tend to be reported on only one or two days a month, almost certainly recorded by either the date on the paperwork filed with the FEC (or whoever it is that gets that stuff), or the postmark on the envelope it was mailed in. That incriminating "9/11/01" was almost certainly the work of either a congressional staffer filling out forms late on 9/10/01, or a postmark stamp from the USPS (and even on that darkest of days, the mail still came). But let's forget about that--let's assume for the sake of argument that all of those checks really were donated exactly on September 11th.

Let's leave aside all of the justifying arguments, and elided details, and rational explanations, because, at the core of it, none of that matters. Stripped down to its thoroughly vile essence, what do we have here? A list of Arab-sounding names on a fundraising report.

Well, thank God for you, Tail-gunner Joe! We'll string her right up from the nearest tree. I mean, everybody knows those filthy camel-jockeys are all terrorists, and anybody who took money from them must be tainted.

And hey, while you're at it, you better ship me off to Guantanamo Bay, too-- a close colleague of mine is of Iranian descent, and we all know they're Evil (the President said so!), and I have lunch from time to time with another Iranian, who's a devout Muslim. Yep, slap on the cuffs, and strip me of my civil rights, I'm consorting with terrorists.

I am thoroughly sickened by this. This argument is so utterly revolting, I have trouble keeping my hands steady enough to type. McKinney's said some idiotic things, but nothing she's said or done can possibly justify this McCarthyite horseshit. The people crowing about this on the Indepundit site and elsewhere are lower than the slime that pond scum scrapes off its shoes, and I would've expected better of Jim Henley.

Several people, myself among them, have raised the dark spectre of the Japanese-American internment camps, and McCarthy-style blacklists, when discussing the civil rights concerns raised by recent government actions. Until this weekend, I've always thought that was little more than a rhetorical scare tactic, that the memory of those dark chapters in American history would be enough to restrain the worst excesses of the War on Terrorism. With this poisonous idiocy entering the major media, I'm no longer so sure.

If we're going to go down this road, and I fervently hope we're not, I hope we can at least get a Joseph Welch to stand up and ask "Have you no sense of decency, sir?" before irreparable damage is done.

Posted at 2:10 PM | link | follow-ups | 6 comments

Friday, August 02, 2002

May There Be No Moaning of the Bar When I Choose Answer "C"

Kate took the New York Bar Exam this week, on Tuesday and Wednesday, which got me thinking about the topic of horrible transitional professional rituals. Academia in general (and physics specifically) lacks the sort of standardized, institutional trial by ordeal that the Bar provides (plunging your bare hand into boiling water might be preferable to two hundred multiple-choice questions about English Common Law), but there are a couple of roughly similar trials most Ph.D. scientists face. Three, if you count the GRE (the Physics GRE is a miserable experience), but we'll leave that aside.

The first big hurdle most people face in graduate school, and the closest thing academia has to a Bar Exam, is the Ph.D. qualifying exam. This is usually at the end of the first year of grad school (some places put it at the end of the second year, and a few forgo it altogether), and takes the form of a long comprehensive exam covering everything you're supposed to have learned in the required courses for the program. At Maryland (and many other places), this was stretched over two days, the first day (for the Chemical Physics program I was in) consisting of questions on thermodynamics, statistical mechanics, and "quantum chemistry" (basically molecular physics), the second day consisting entirely of questions about quantum mechanics.

I actually bombed this exam thoroughly. The first two (of three) problems on the first day were a cake-walk, but the third was completely out of left field. It asked about a topic we had never covered in class, and I couldn't get anywhere with it. I spent close to an hour staring at the damn thing, and couldn't get past sub-part a). I wound up writing the pathetic little "If I knew how to solve the first part of this, I'd take the answer from that part, and do the following..." essay that you write in hopes of getting at least a tiny amount of partial credit. (In the unlikely event that any of my students are reading this, and you find yourselves in this predicament, know that I have the utmost sympathy for your plight, and will actually give a fair amount of partial credit for that...)

That question rattled me so badly that I went into the second day in a very fragile state. When I didn't immediately see how to answer the first question, or the second, I was done. The third was something I knew how to do, but by that point I was a broken man. That night, seeking to get my mind off physics, I went to a movie. Unwisely, I chose to see Natural Born Killers, and found myself thinking "Well, the physics thing hasn't worked out, but I could still shave my head and go on a multi-state killing spree..." (On top of that, it's really not a very good movie...)

(Obviously, I didn't end up shaving my head and killing a lot of people. I had to take an oral exam as a make-up, some months later, which I passed with little trouble. The quantum questions, seen in calmer circumstances, were actually not all that bad, though I still have no idea how to do the miserable thermodynamics question that wrecked the whole thing for me...)

The other big liminal ritual in academia is the Ph.D. thesis and defense. After passing the qualifying exams, you're mostly done with course work (most people take a few more classes, on specific areas of interest for their research), and throw yourself full-time into research. In order to receive a Ph.D., you're basically required to become the World's Greatest Expert on something nobody else really cares about. (In my case, that was ionizing collisions between laser-cooled atoms of xenon in a metastable state. See, you don't care about that, do you?) You spend several years (Nathan helpfully provided the statistics on time-to-graduation for physics in a comment thread associated with an earlier post) piling up research results, and hopefully a few publications, and then you sit down and write a book about the results. Then you get together a moderately large group of professors, give a short research talk about the work, and then they quiz you on the presentation, the book, and, incidentally, anything else they happen to feel like asking you about.

This is an exceptionally trying time, summed up reasonably well in, of all things, an article on ESPN's Page 2 site (the thesis stuff is buried in an otherwise daft article about the glories of the Rocky soundtrack):

All your insecurity demons come out of the closet when you write a dissertation. Every sentence, every phrase, is a chance to become more convinced of your inadequacy. Somehow -- hypnosis, therapy, fear of disappointing your parents, a stout glass of wine at noon and another one at 5 o'clock every day -- you get it done. But that's all it is: done. It's not good, and it's not worth reading and it's nowhere near the project you hoped you'd have when you started. You can barely stand to look at it.

That's what makes the last days before the defense so painful: You're sitting around waiting for the committee to ask you what the hell you thought you were doing when you tried to put this flimsy, illogical piece of sham scholarship over on them.

In my particular case, the problem was compounded by two things: first, my advisor at NIST, Bill Phillips, is legendary in the Atomic Physics community for asking tons of questions of speakers. And since he's a brilliant guy, and usually grasps the point of the talk very quickly, these are generally very good, and very difficult questions (and if he starts a question with "don't you really mean to say..." the answer is always "yes"). Years of working in the Laser Cooling Group had left me with a mortal dread of hearing Bill say "now, before you leave that slide..." when giving a talk.

The other big factor was the "outside the program" member of the committee. As a check against distinguished professors putting together sham committees to rubber-stamp favored students through, each dissertation defense committee is required to have at least one member who comes from outside the degree program in question. As there were tight time constraints on my defense, this person was found for me by the chairman of the Chemical Physics Program, and turned out to be a very sharp Condensed Matter physicist from the superconductivity center. Unfortunately, he was also a very demanding guy, and when I met with him to schedule the defense, he presented me with a list of conditions that had to be met before he would agree to be on the committee, and regaled me with stories of past students who had failed to meet these criteria, and had "had a very hard time of it, I'll tell you that." My insecurity demons working overtime, I was completely convinced that he would find something absolutely dreadful in my thesis, and nail me to the wall.

Of course, I was overreacting-- if they thought I wasn't qualified, they never would've let me schedule the defense, and it would be bad form for an advisor to hammer a student during that student's thesis defense. And, of course, as I noted when I talked about research talks in general, if you're any good, you know the subject matter inside and out, upside down and sideways. The outside-the-program committee member blew up at one point when I used a term that's often misused in his sub-field ("ballistic"), but once I understood his objection ("Who came up with this term 'ballistic,' anyway?!?!" "Um, Galileo? I don't know-- is this an etymology question?"), I was able to answer it.

There's a famous story about Isaac Asimov's thesis defense (he says coolly, and then can't find a good link for it... It's recounted near the bottom of this page, which is really about something else entirely), where one of the examiners asked him a question about a spoof research article he'd published under a pseudonym in a pulp magazine. He said later that at that moment, he knew he'd passed. I didn't publish any spoof articles, but I had a similar moment when one of the committee members asked "You've told us a lot of wonderful things about xenon, but do you know who discovered it?" ("I don't know. Lord Kelvin. Do you have a real question?")

(The answer is actually William Ramsay and Morris Travers-- Lord Rayleigh gets credit for argon. In case you care. Which you don't.)

So that's the physics version of the trial by ordeal. I'm not sure how the academic versions of this really compare to the Bar Exam-- I didn't shell out two grand for eighty pounds of review books and six weeks of taped classes on how to pass my thesis defense, but then I did spend six years making a pittance working with lasers in a room with no windows. Tough call.

The clearest common point between the Bar, the qualifying exam, and the thesis defense is that just finishing the damned things is a huge relief. The best part of passing the test (I have little doubt that Kate passed (she is, after all, smarter than I am, however much she denies that)) isn't the official reward that awaits, but rather the knowledge that, whatever else might happen, you never have to do that again.

Posted at 10:16 AM | link | follow-ups | 12 comments

Thursday, August 01, 2002

Another point from the same article about the drop in grad school enrollments:

One interesting note is that the enrollment of "traditional" science students (white males) has been declining for a long time, much longer than the last 10 years, according to the NSF study. But the overall number of graduate students remained unchanged, due to increased numbers of both female and foreign students.

Enrollment of foreign students, in particular, ballooned in the '80s and '90s. Many of these students ended up settling permanently in the United States, but statistically, about half returned to their home countries. These top-ranked scientists then set up university departments of their own, and continued collaborating with their US colleagues. Now, for the first time in decades, foreign enrollment in American science programs is actually declining. That's probably good for global well-being, but it also means that an important source of science students (as well as American-immigrant scientists) is drying up.

There's a sort of inside-baseball element to the foreign student situation that's not really touched on here. While part of the problem is that foreign universities have improved to the point where students will remain in their home countries rather than come to the United States for grad school (a net positive, in the long run-- the more good universities there are in the world, the better), there's another aspect as well: in many ways, foreign students started to become more trouble than they were worth.

In 1999 and 2000, I had a lot of conversations at conferences about the foreign student problem, mostly in regard to Chinese students (the Chinese were the worst offenders), who were causing all sorts of headaches at a number of universities. First of all, the TOEFL (Test of English as a Foreign Language) scores submitted for Chinese students turned out to be an absolute joke. It's not clear whether there was institutionalized cheating in China, or they had just figured out how to game the test, but a distressingly large fraction of the Chinese students entering physics graduate programs turned out to speak essentially no English. (One professor I talked to said that their department had been fortunate enough to have a faculty member spending a sabbatical year working with a collaborator in Shanghai. They paid for Chinese applicants to go to Shanghai and be interviewed in person, which let them weed out the ones who really didn't speak English.)

An even bigger problem was with students who would enroll in a physics program, then show up and take a bunch of computer science or engineering classes before transferring into a CS or engineering program. The CS and engineering programs would fill up their classes faster than physics programs did, so students were using physics programs (and physics department funding) as a way to get into the US, despite having no real intention to pursue a physics degree. Worse yet, they would frequently take two or three TA jobs in different departments, to get extra money. Put together with their lack of English ability, this meant that everyone, from the faculty (whose cheap labor pool was draining into other departments) to the other graduate students (who were squeezed out of TA jobs), to the undergraduates (who had to take classes taught by uninterested foreign students with minimal English) was miserable.

Most of the decline in foreign student enrollment is probably due to students choosing to remain home in greater numbers, as the article suggests. (Further foreign student reductions may result from post-9/11 monkeying with the INS-- then again, maybe not. It's hard to say what's going on with the INS). Some chunk of it, though, is due to the fact that many universities (at least based on an informal and unscientific survey a couple of years ago) have decided to crack down on some of these abuses-- applicants are screened more carefully, and the rules governing transfers between departments have been tightened up.

So there's your juicy physics gossip tidbit for the week...

Posted at 11:55 AM | link | follow-ups | 6 comments

SciTech Daily links to an interesting article in the Christian Science Monitor about the recent drop in the number of grad students:

For several years now, a complaint has been heard in the hallways of our top universities: where have all the graduate students gone? Every year, there seem to be fewer and fewer qualified students applying for positions in science and engineering doctoral programs.

The problem is far from anecdotal. Now, with statistics compiled by the National Science Foundation, professional science organizations, and the federal government, it's official. Prospective students are turning away from careers in science. Since a peak in the early 1990s, the number of science and engineering students has tanked. In some fields, the decrease has been as much as 5 percent per year, according to a study published by the National Science Foundation. In electrical engineering, enrollments have dropped nearly 30 percent in the last 10 years. Overall, the number of Ph.D. students in science and engineering is at a 40-year low, and there is little sign of a turnaround.

The author goes on to attribute this to a number of factors, chiefly the fact that professional scientists aren't all that well-paid, and that working conditions for junior scientists, particularly post-docs, aren't all that great. As a junior scientist, and one who did a post-doc (and wrote a letter arguing that it wasn't all that bad), I find this a topic somewhat dear to my heart.

I think the piece hits on the right explanation, but for the wrong reasons. Obviously, I can't speak for scientists in other fields, but the salary I received as a post-doc at Yale (in the high $30,000's) wasn't "appalling", and wasn't too far down from what a junior faculty member is paid at a small liberal arts college. Or, for that matter, a recent law-school graduate taking a job in public interest law. Colleagues of mine who did post-docs at large government labs (NIST, LANL, LLNL, NIH, etc.) would take a pay cut to move to a small college academic job, and NRC post-docs would probably take a hit at a research university as well (I'm less certain of the pay scale at larger institutions). Now, granted, I worked for a very well-funded group, but I wasn't three standard deviations off the mean salary for a physics post-doc.

In a larger sense, though, anybody going to graduate school already knows that a science Ph.D. isn't a path to vast riches. You should only go to graduate school-- in science or anything else-- if you really, truly love doing research in your chosen field, and are prepared to put in long hours for little pay to pursue that goal. The work has to be its own reward, or you'll never make it through (particularly when you get to the Thesis Hell stage, which Brad DeLong describes fairly well (albeit for economics, not a physical science)). In physics, at least, being a post-doc is a significant step up from being a grad student, and science post-docs are treated much better than medical doctors at a similar stage in their careers.

What's reduced the supply of grad students is not just the lousy pay and long hours-- the people who are inclined to go to grad school aren't in it for the money. If it were just about money, nobody would ever go to grad school in the sciences. Coming out of college, I easily could've taken my physics degree and gotten a job on Wall Street making four times what I made going to grad school. I didn't do that because I would've been bored blind by the work, and the hours they expect really aren't any better than grad student hours, which are at least spent doing interesting work. Hell, I could've become a high school teacher, and doubled my salary.

What's drawn students out of grad school over the past ten years or so has been the tech bubble. Physical science, engineering, and computer science all draw on roughly the same pool of proto-geeks-- people entering college with some interest in pursuing their education in a technical field. Some of these students will be locked into a particular track from the beginning, but many of them are a little vague, and could end up in any of those fields, given a slight push.

The dot-com madness of the late 90's was a very big push. Truckloads of money (well, OK, stock options) were being rained on anybody who could find the power button on a computer and use "Java" in a sentence, a set of people that includes most of the tech types who could end up in grad school. And while computer coding strikes me as fairly dull work, it's a whole lot closer to the sort of thing I find intellectually stimulating than Wall Street work is. Had I been on track to graduate college in 1998 rather than 1993, I would've been sorely tempted to take more computer classes, and get a job working for a dot-com, tempted in a way that I simply wasn't tempted by financial jobs. (A friend of mine who worked for Goldman Sachs after college tried to talk me into applying for that sort of job, but they really held no interest for me.) People with a less clearly defined interest in physics than I had almost certainly would've gone the computer route-- they would've been fools not to.

With the bottom dropping out of the tech sector, and a lot of those jobs drying up, I'd actually expect to see graduate school enrollments start to tick up. It'll take a few years to clear out those who are already in the computer pipeline, as it were, but in a few years, other sciences will start to seem more attractive to incoming students, and more of them will choose graduate school. Especially if the economy continues to flounder a bit-- grad school is a fairly recession-proof occupation. The numbers won't come back to their early-90's high point (even though the bubble has popped, there are still more jobs for right-out-of-college computer types than there were in the early 90's), but they'll creep up a little, and reach a new equilibrium, at least until the next technology-driven speculative bubble hits.

Posted at 11:16 AM | link | follow-ups | 2 comments

Wednesday, July 31, 2002

Everything I Need to Know I Learned From Watching Spider-Man

There's been a fair bit of blogging today about the Keith Olbermann article in Salon, where he lays into Ann Coulter and the media in general for ignoring early warnings about terrorism in favor of more glamorous topics like Bill Clinton's sex life. Olbermann's always been a little erratic, but when he's on, he's very good, and he's mostly on here.

What struck me most was this passage, though:

But I tend to think Rose is a lot closer to understanding what he did, and why people hold it against him, than is Ann Coulter. Since Sept. 11 she has been a veritable out-of-control firehose of venom, whipping around crazily, streaming invective wherever she happens to point. I wouldn't be so disturbed if I sensed there was a glimmer of irony in this new book of hers, some quick wink of Buckner-like acknowledgment that "Slander" might be read not as a title, but as a description of the contents.

I had exactly the same thought ("At least she went for truth in advertising") earlier today, when I saw her book in a store while I was running other errands. Of course, even there, she's wrong, having forgotten the words of J. Jonah Jameson: "Slander is spoken. In print, it's libel."

Posted at 4:12 PM | link | follow-ups | 3 comments

The Drexler Continuum

Somewhat ironically, my earlier post about cryonics was prompted in part by one of the explanations offered for how future doctors would be able to resuscitate frozen patients-- that "nanomachines" would be used to repair the damage caused by freezing on a cell-by-cell basis. It's ironic because, while this was an explanation offered to make cryonics seem more plausible, if anything it increased my skepticism on the subject. "Nanotechnology" is the "nuclear" of the new millennium-- it's claimed as the mechanism for all sorts of pie-in-the-sky bits of technological wizardry that we're assured will certainly be coming any day now, in just the same way that an earlier generation of futurists thought nuclear power would be the cure for all our energy woes. Invoking nanotechnology, particularly the extravagant claims of the Drexler branch of the field, as the mechanism for some future technological wonder draws an almost reflexive "Yeah, right..." from me.

That's not to say that nanotechnology is cold fusion writ small, just that it's rare for any technology to really follow exactly the trajectory suggested by its most fervent proponents. I think we'll get wonderful new gadgets out of the study of very small machines (a recent Physics Today article (sorry, no link) points out that we've already got some), I just don't really believe that they'll be based on tiny robots constructing useful devices atom by atom, or little free-roaming von Neumann machines in the bloodstream repairing damage on the cellular level. Those claims have the same sort of ring of True Belief as past claims of "electricity too cheap to meter," the sound of people who have made the leap from "wouldn't it be cool if we could do this?" to "we'll definitely be able to do this, and won't that be cool?" without passing through the intervening layers of careful investigation of the real potential of a technology.

Science and technology move forward, but we're really, really bad at guessing the directions they'll take, or even what problems will prove difficult. I don't read a whole lot of "futurist" books, but I do read a good deal of what booksellers term "genre fiction," mostly science fiction and fantasy. The "hard SF" sub-genre fairly closely tracks the current conventional wisdom about the future of technology-- essentially, the past hundred-odd years of science fiction provide a very nice record of where people thought we were headed, dressed up a bit with noble, selfless, and hyper-competent techie heroes and the women who love them, and the occasional sinister alien menace. It's sort of interesting to look back over the history of the field, and see what people thought the future would be like, what they got right, and what they got wrong.

There's a fair amount of back-slapping in the genre over the successes various authors have had in predicting the future-- Jules Verne predicted submarines, Arthur C. Clarke predicted communications satellites, etc.-- but the successes are not nearly so interesting as the failures (especially since many of the successes require you to bend and twist the original sources to see the success...). It's sort of trite to complain about the lack of flying cars, and anyway William Gibson nailed the Gernsback era better than I ever could, but there are all sorts of little things that people missed. Clarke predicted communications satellites, but I can't think of anyone off the top of my head who really got cell phones, or the Global Positioning System, before they existed in the real world. Nobody really picked up on the ubiquity of personal computing until it was a fait accompli, and those authors who did pick up on the idea of the Internet mostly missed its real impact, which has been not so much at the level of the giant multinational corporation, but down at the slob-on-the-street level, providing unparalleled access to information, and even dopey things like online shopping. A month or so ago, I was reading a Henry Kuttner short from the early Fifties that failed even to pick up the cultural implications of TV...

Lasers are another decent example, and one near to my heart. At the time of its invention, the laser was famously deemed "a solution in search of a problem." Forty-odd years later, it's the solution to all kinds of problems, many of which the original inventors wouldn't've recognized as problems in the first place. There are a few obvious applications, like the various uses of lasers for cutting metal, and some scientific research, but there's a huge spectrum of ridiculously mundane uses for the things that no-one would've predicted-- telecommunications, CD players, grocery store scanners, all the way down to the laser pointers used in giving talks. And yet we still don't have the laser death weapons that everybody assumed we'd build.

As a general rule, it seems that technological forecasters do a reasonably good job of predicting changes on a very coarse scale, in terms of seeing a use for a technology on the level of governments or huge corporations. For a genre that relies so heavily on rugged individualists for characters, though, SF authors are absolutely miserable at predicting the effects of technology on an individual level. William Gibson notes the difficulty by saying "the street finds its own uses for things," which is a memorable phrase, if a little too stylishly grungy to really be accurate-- the predictions don't fail because authors fail to take into account the uses small-time hustlers and petty crooks find for technology, they fail because the authors fail to pick up the uses found by the comfortable members of the middle class.

Another interesting aspect of the process is the way new concepts tend to sweep in, take over everything, and then trickle on out when the real-world technology fails to keep up. Space travel has been a staple of the genre, off and on, for years, often combined with a wildly optimistic view of the technology needed-- there's something incredibly charming about a book like Mission of Gravity, where the author assumes that we'll reach the distant stars while still doing calculations with slide rules. People continue to cling to the idea of space colonization, but the popularity of space stories is nowhere near what it used to be, as it's gradually become clear that space travel is expensive and fairly impractical, and there isn't a really compelling reason to send humans out into space at the moment. Nuclear power is another one-- I recall reading a number of stories from the distant past where people tooled around in nuclear (fission) powered cars and the like, a dream that ran afoul of the nasty properties of nuclear waste (which are over-sold, but still pretty bad). Fusion's another staple, though commercial fusion power remains twenty years off, and is expected to remain twenty years off for the next twenty years or so. Computers and "virtual reality" had a pretty good run for a while as the darlings of futurists everywhere, with half the writers in America apparently convinced that we'd all be spending most of our time in fantasy worlds by now, but that's tailed off as well.

The current reigning champions for Hot Technologies, in science fiction and out, are biotech and nanotech-- twenty years from now, the current line goes, we'll either be replacing our failing organs with cloned replicas or staving off organ failure by means of miniature repair robots in our bloodstream. Given the dubious history of technological prognostication, though, it's hard to really credit these predictions more than the "colonize the Moon with slide rules and mainframes" visions of the past. The vision being pushed, particularly for nanotechnology, is just a little too rosy to be believed, and the problems being brushed aside to get to the imagined marvels of the future loom larger than the techno-optimists would like to believe. The "atom-by-atom" construction robots are a profoundly classical idea (I pick up this atom and put it here, then that atom, which goes there) that would need to function in the microscopic world of quantum mechanics, while the "cellular repair robots" vision tends to rely on a fairly unrealistic extrapolation of Moore's Law (about which more later) to give the robots the necessary processing power. I don't see either of those obstacles being overcome easily.

Which is not to say that nothing worthwhile can come from research into nanotechnology and biotechnology. The failure of past predictions to pan out as advertised doesn't mean that we haven't had marvelous technological advances over the past several decades-- cell phones, the Internet, DNA analysis, GPS navigation, ubiquitous computing (my car has more computing power than the Apollo spacecraft did), and so on. A person from 1950 who managed to pop up in 2002 without passing through the intervening decades would be awed by some of the things we take for granted.

Similarly, the likely failure of current predictions doesn't mean we won't get wonderful things out of the trendy technologies of the moment. They're just not likely to be the things we're told we'll get now.

Posted at 9:30 AM | link | follow-ups | 10 comments

Tuesday, July 30, 2002

Yep. Still Green.

There were summer student research talks at lunch today. During one of the talks, by a biology major, I noticed a URL in a photo credit, which led me to: The MossCam Project. Yes, it's just what it sounds like. It's a webcam pointed at a mossy rock somewhere in California.

Which is not to say that there isn't wonderful science to be done in studying the behavior of mosses growing on rocks in California. But there's something inescapably ridiculous about a site which contains the text "click here to view quicktime movie of moss."

Posted at 8:22 PM | link | follow-ups | 1 comment

Blinded by the Snake

An additional radio note, following on yesterday's post.

One thing I've noticed is that every radio station seems to have at least one signature song that they play again and again, completely out of proportion to the actual popularity of the song. I'm not talking about the latest hits, here-- of course the newest smash from Puddle of Creed is going to get played every hour, like it or not-- but older, minor hits from yesteryear. For instance, somebody at WHFS back in the day had a thing for Robyn Hitchcock, and used to play that "Balloon Man" song at least once a day.

One of the many failings of the Clear Channel near-monopoly is that this gets extended over multiple stations. Between the "Adult Contemporary" station and the "Classic Rock" station, I've heard Manfred Mann doing "Blinded by the Light" more times in the past year than in the preceding thirty. Driving around the other day, I heard it on one station, changed the station to avoid a commercial break, and heard it again on the other. I can't quite figure this out-- it's a decent enough song, though it leaves out half of Springsteen's lyrics, and it was a hit, but it doesn't really rate daily airplay on two stations in the same market. (On the subject of this song, there is a page of amusing guesses as to what, exactly, Mann is singing in the chorus, though some of those just have to be made up for comedic effect...)

The strangest by far, though, is the 80's station. Their signature tune, for no reason I can fathom, is "Union of the Snake" by Duran Duran. Now, I know the 80's. I grew up in the 80's, and while I was never victimized by 80's fashion in the same way as the people the Poor Man talks about, I nodded in recognition through that whole piece.

Back in the 80's, my 80's, every girl in the ninth grade had the screaming thigh sweats for Simon Le Bon. I heard more effusive nonsense about how wonderful Duran Duran was than a pubescent male should ever have to endure. And not once did any of those girls speak of their admiration for "Union of the Snake." I can't honestly say that I was even aware, back in the 80's, that they had recorded a song called "Union of the Snake," though I did make an effort to avoid knowing anything at all about Duran Duran.

Who likes this song? What thin-tie-and-kerchief wearing freak is choosing songs for this station? Couldn't we drop this song from the playlist, and replace it with something more pleasant, like, say, four minutes' worth of Emergency Broadcast System testing?

Posted at 8:55 AM | link | follow-ups | 9 comments

Internet Anaconda

Here's a boycott I might be able to get behind: Matthew Yglesias is following Atrios's lead, and going "Sullivan Free" for a while.

Well, actually, I'm half tempted to attempt to pick up the Sully-bashing slack, because he does piss me off (see previous entry). But I don't really need the extra aggravation, and Kate certainly doesn't (she's taking the Bar today and tomorrow, and doesn't need to listen to me rant about what a pinhead Sullivan is...). So I'm in.

Hey, Matt, maybe you can use this as a springboard to becoming the liberal Instapundit...

Posted at 8:23 AM | link | follow-ups | no comments

Monday, July 29, 2002

Return of the Pee-Wee Herman Defense

Andrew Sullivan gets huffy about the recent William Safire "blog" piece, wherein the noted language maven wrote:

"Will the blogs kill old media?'' asked Newsweek, an old-media publication, perhaps a little worried about this disintermediation leading to an invasion of alien ad-snatchers. My answer is no; gossips like an old-fashioned party line, but most information seekers and opinion junkies will go for reliable old media in zingy new digital clothes.

Sullivan's response is to deride the reliability of "old media" by trotting out a list of Safire's errors over the years:

Last February, Safire conceded he had misplaced the context of a quotation by Shakespeare, miscalculated the odds of several politicos, misunderstood the real meaning of "parlous," got the name of a Conan Doyle watchdog wrong, and so on. Nothing wrong with that, and his corrections column was gracious, if far less prompt than most bloggers'.

Of course, to his credit, and unlike some bloggers, Safire actually tells people when he makes a correction to something he wrote...

Is it just me, or is there a little "I know you are, but what am I?" element to this swipe, coming as it does on the heels of Sullivan getting "fact-checked" on his surreptitious editing?

Posted at 1:42 PM | link | follow-ups | no comments

Radio Free Wasteland

In many ways, I was badly spoiled by going to grad school in the DC area. There was decent (not great, but decent) public transportation to get around the city, all sorts of cultural stuff to see and do, a great selection of restaurants where even a Starving Grad Student could afford to eat, and, of course, the Brickskeller where I could blow the money I saved by buying cheap food on expensive beer brewed by Trappist monks in Belgium.

I really miss the restaurants-- there are some pretty good places in the Albany area, but nothing like the variety I had available when I lived in (Don't Go Back to) Rockville, MD-- from my house, it was an easy walk to five of the restaurants on the Cheap Eats list, for chili, Caribbean food, Indian food, or two different kinds of Vietnamese. Another half-dozen were a short drive away. There's just no beating that.

One of the other features I've found myself missing in recent years is the decent selection of radio I had available, particularly WHFS, which was then in its heyday. That was a great station-- they played a good variety of stuff (provided you liked alternarock sorts of things), from Bob Marley to Nirvana, to the occasional William Shatner record late at night, they had a morning show which consisted primarily of music, interrupted occasionally by actual news reports (and the Daily Feed, which was a riot), and the DJ's were amusing but relatively unobtrusive.

Now, I'm living in the Great Radio Desert of upstate New York. There's the inescapable classic rock station, and the candy-ass Adult Contemporary ("the best of the 70's, 80's, 90's and today!") station. There's turgid New Metal touted as "Your New Music Alternative", or turgid New Metal with occasional interludes of decent music, touted as "The Real Alternative" (the only one of these stations not owned by Clear Channel). The one station on the dial that's not depressing for being tediously predictable is the Eighties station, which is depressing because it makes me feel old. And, really, didn't Poison get enough airplay during the Eighties? Can't we please have mercy on the ears of future generations?

All of these stations feature Morning Shows which are Zany to a greater or lesser degree-- unlistenably so in the case of the Classic Rock station-- and make me want to find the station and throttle the chirpy morons they're subjecting me to. (Typical example of a shameful DJ moment: in a story about the National Spelling Bee a month or two back, one of the morning show twits was giggling over the fact that a previous winner had had to spell a word that he pronounced "defecation," though he admitted that he would've gotten it wrong "because of the silent 'i' in there." The word? "Deification.") None of them get far beyond a playlist of maybe three dozen songs.

The lack of decent radio is exceptionally irritating to me because I like to have music playing more or less continually. Bruce Baugh describes the basic idea pretty well: "I have music on nearly all the time when I'm awake and not doing some specific other thing that generates sound. I like it, and it helps keep the flow of my thoughts ordered; otherwise my time sense starts drifting badly." Without music as background noise, I lose focus, and get distracted by other incidental noises that would be covered by the music. Silence drives me nuts.

(Of course, my preference for a constant personal soundtrack drives Kate nuts... You can't win 'em all...)

At home, I can get around this problem by listening to my MP3 collection (yes, I downloaded stuff from Napster, back when it was a going concern, including the Mojo Nixon cover of "Girlfriend in a Coma" that's been stuck in my head all morning...), or throwing a bunch of CD's in on shuffle play. This isn't that satisfying, though, as I like hearing new stuff from time to time. A slightly better option that I've started using recently is the "Digital Music" feature of my digital cable-- I've got umpteen different specialized music feeds via the tv, and they play a pretty good range of stuff.

Neither of these really helps me at work, though. In my office, I'm pretty much stuck with the radio, thanks to my lemon of a desktop computer, which doesn't deal well with streaming audio. In the lab, though, I've found a better solution, or, rather, the student who's working with me for the summer has found a solution. It's one of the handful of Web radio stations that hasn't been driven out of business by the ridiculous royalty policy bought and lobbied for by the record industry, operating out of Seattle.

It appears to be some sort of public station, as they don't do commercials, and it's everything I'd like a radio station to be (save "in the same time zone"-- it's very disconcerting to hear the DJ's announcing times that are three hours behind). They play a great variety of stuff, from pop-punk to bluesy folk-rock, they hardly ever repeat songs, and the DJ's are calm, soft-spoken, and knowledgeable. They play new stuff, stuff by artists I've never heard of, and material by local bands I'll probably never hear of again. It's a really good station, probably even better than WHFS at its peak (sadly, on my last visit to DC, I found that WHFS, too, has been sucked into the Zany Morning Show swamp of big conglomerate suckitude...)

I'm not much for the anti-corporate, anti-globalization movement-- I think their intentions are good, but they're hopelessly idealistic and naive-- and the "information wants to be free" arguments of Open Source zealots also fail to convince me (I like the response attributed to Bruce Sterling-- "Information wants you to give me a dollar."). I have the same reflexive and contrary reaction to people who assume that all corporations are Evil as I do to people who assume all unions are Evil.

But, when confronted with the contrast between KEXP's web broadcasts and the homogeneous crap spewed by the Clear Channel clones available locally... Well, it's hard to think of a better argument in favor of utterly destroying Clear Channel and the RIAA.

Posted at 12:09 PM | link | follow-ups | 5 comments

Sunday, July 28, 2002

Pascal's Wager On Ice

The NZ Bear post I referenced earlier talking about the barriers to "once a week" blogging missed one of the biggest obstacles, because it's really not a technological problem: If you only post once a week, or even once a day, by the time you get around to putting your thoughts on an issue up on the web, everybody else is sick of talking about it, and nobody wants to read a late entry into the fray, however insightful it may be.

This is one of the biggest problems I have with the way I'm running this weblog, and it's going to strike again, here. Last week, in the wake of the Ted Williams story, there was a discussion of cryonics issues over in Bruce Baugh's comments section. I posted one or two things there, and had other comments to make about cryonics and the larger issues of scientific progress, but other things were in the queue, and I'm only getting around to it now. Hopefully, I'll manage to blunder into one or two interesting statements, and get a few people to read this, but I suspect most people who were participating in that discussion have long since moved on to other ideas.

The central argument of cryonics boosters, exemplified by Rand Simberg, is that the forward march of technology is inexorable, and all the ills which presently afflict the human body will be curable in the future. Therefore, you should definitely have yourself frozen upon your "death," even though we really don't have a good idea how to freeze and revive a person, because at some point in the future, we will know.

This level of techno-optimism is sort of charming, but really, the whole thing amounts to a Pascal's Wager argument. Actually, it's worse than that-- it's four simultaneous wagers. By opting to be frozen, you're betting your life on the propositions that:

  1. Future medical technology will be able to cure what's killing you,
  2. Future medical technology will be able to freeze and revive human beings,
  3. Future medical technology will be able to repair any damage done in today's imperfect freezing processes, and
  4. Someone in the future will actually care enough to bother thawing you out.
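To make the compounding explicit, here's a back-of-the-envelope sketch in Python. Every probability below is invented purely for illustration (and is, if anything, generous to the cryonics side), and the wagers are treated as independent, which is also charitable; the point is just that even decent odds on each individual wager multiply out to long odds on the whole package:

```python
# Four wagers, each assigned a (generously) assumed probability.
# These numbers are invented for illustration, not estimates.
wagers = {
    "cure for what killed you": 0.5,
    "freeze and revive humans": 0.3,
    "repair today's freezing damage": 0.2,
    "someone bothers to thaw you out": 0.5,
}

# You only win if ALL four come through, so (assuming independence)
# the probabilities multiply.
joint = 1.0
for name, p in wagers.items():
    joint *= p

print(f"Joint probability of winning all four: {joint:.3f}")
# -> Joint probability of winning all four: 0.015
```

Even with coin-flip odds on two of the wagers, the package pays off less than two percent of the time under these made-up numbers.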

These are all dubious propositions. Cryo-boosters tend to shrug them all off (well, OK, they generally only deal with the first three) by invoking the wonders of technology. Technology, they assume, will advance without limit ("Moore's Law" is often invoked, but that's a rant for a different day)-- some even suggest that nanomachines will be used to repair freezing damage on a cell-by-cell basis.
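Since "Moore's Law" keeps getting invoked, it's worth seeing what an unlimited extrapolation actually claims. The sketch below assumes the commonly quoted doubling period of eighteen months; the period and the time horizons are assumptions for illustration, not forecasts. Over cryonic storage timescales, the naive extrapolation demands growth factors of ten or twenty orders of magnitude, which is exactly the sort of thing that ought to give the techno-optimists pause:

```python
# Naive Moore's Law extrapolation: capacity doubles every 18 months,
# forever. (The doubling period and horizons are assumptions for
# illustration, not forecasts.)
DOUBLING_MONTHS = 18

def growth_factor(years):
    """Factor by which capacity grows after `years` of uninterrupted doubling."""
    return 2 ** (years * 12 / DOUBLING_MONTHS)

for years in (10, 50, 100):
    print(f"{years:3d} years -> growth factor of about {growth_factor(years):.2g}")
```

Ten years of doubling buys you a factor of about a hundred, which is roughly what actually happened with transistor counts. A century of uninterrupted doubling, the sort of span a frozen head would be waiting out, demands a factor of around 10^20, and no physical process doubles forever.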

I don't buy it-- forecasting technological progress is a fool's game. I can readily believe some elements of the above list, but not the whole package.

To see the problems with wagers 1) and 3), consider the case of someone who died of a progressive, degenerative neurological disease-- Alzheimer's, say, or Parkinson's, or maybe Lou Gehrig's Disease. I can readily believe that future technology will be able to prevent these conditions, or halt their progress, but reversing the process? Those diseases (in my pop-science level of understanding, at least-- corrections from actual biologists and doctors are welcome) cause a steady degeneration of brain tissue, a destruction of neurons, and more importantly, the inter-connections between neurons. And as Charles Murtaugh rightly points out, "a mind is more than a bunch of neurons, it's also the connections between them. Those connections constitute information, which if lost during freezing and thawing becomes irretrievable." Reversing the damage caused by a degenerative disease would require knowing what the brain looked like beforehand, in detail-- what cells were where, and which cells connected to which other cells. Even a cell-by-cell reconstruction by nanomachines would require a blueprint to work from-- every body is subtly different, and every brain is subtly different, and I doubt it's possible to just construct a "generic brain" for someone and have them wake up the same person they were before the brain damage, or before they were frozen.

And this is exactly the sort of thing you're talking about when you invoke cell-by-cell repair of the damage done in the freezing process. Reconstructing the sort of massive cell damage Rand Simberg so blithely brushes off will require essentially this sort of cell-by-cell repair of all the cells and inter-connections in the brain. Doing that without a very clear idea of what the brain looked like before it was dunked in liquid nitrogen, and still managing to end up with a reasonable facsimile of what you started with, strikes me as highly improbable. How important the exact cell structure really is, and what the consequences of an imperfect repair would be, remain open questions, but these are not trivial issues to be brushed aside without a pause for thought.

There may well turn out to be a way to freeze people and thaw them back out without doing significant and irreparable damage. There may even be a way to limit the damage to a level that can be repaired, or to store a detailed brain map to allow the hypothetical nanomachines to repair the damage. But people who are frozen today, when even the ardent supporters of cryonics basically admit that we don't really know what we're doing, are unlikely to ever wake up in the golden future of their dreams. Technology can do wonderful things, but the information lost when interconnections in the brain are destroyed by disease or freezing is not the sort of thing you can pluck back out of the air, and I don't see how you can repair a brain without it.

The fourth point is also a non-trivial one. Given the level of culture shock that would be involved, it's something of a mystery to me why anyone in a future society would even bother to thaw out someone frozen today. It seems unlikely that people of today would have much to offer to a future society with technology advanced enough to thaw them out, save as some sort of historical curiosity, or lab specimen, which isn't much of a life. You're pretty much stuck hoping that future humans are exceptionally altruistic, or devout libertarians bound by the sanctity of a contract signed today. I don't think I like those odds, either.

This is, at least, a purer version of Pascal's Wager than the original. There's no "many gods" counter-argument that I can see-- the choice is really a binary one between oblivion and a life in the future, with no other options. If the four wagers happen to fall out in your favor, then you win big; if not, well, you're dead anyway. But it's important to recognize cryonics (at least in its present form) for the gamble that it is.

Posted at 1:39 PM | link | follow-ups | 1 comment

Context is Everything

I've been sort of shorting the "pop culture" content of this weblog recently, but here's a pop-culture item:

It was my turn to do the wash this week, and on my way over to the laundromat, I caught the very tail end of one of the umpteen "Weekly Top N" shows that run on Sunday mornings. The top spot on this particular show was Eminem's "Without Me." Now, I'm not the biggest rap fan you'll find, but it's a catchy tune, and the kid does have a way with a rhyme, even if he is an obnoxious little punk. So I left it on.

This being a nationally syndicated show, it was a censored version of the song, and censored to match the candy-ass sensibilities of some Midwesterner who might happen to catch ten seconds of it on his way home from church. Each of the nine hundred "ass"-es in the song was scrambled (he offers to kick a lot of asses), as well as a "bastard" or two, and I think there's a "fuck" in there somewhere. They also bleeped out "rag" from the phrase "I'm back, on the rag, and ovulating" (or something like that), which is sort of silly, but whatever.

The absolute low point, though, was the verse where he offers to "go tit for tat with" somebody or another. Yep, they bleeped out "tit."

Censorship courtesy of Beavis and Butt-head, I guess. "Heh-heh, heh-heh, heh-heh-- he said 'tit'..."

Posted at 1:03 PM | link | follow-ups | no comments

Great Moments in Opinion Polling

Headline on the front page of the print edition of the New York Times sports section today (in Albany, anyway):

Shea Fans Don't Want Players to Strike

Yes, that's right, people at a baseball game don't want to see a baseball strike. Only in the New York Times do you get this kind of penetrating analysis and investigative reporting.

In future issues, I expect the Paper of Record to reveal that patrons of Tipsy McStagger's (no points for the reference) are opposed to a return of Prohibition, and fans interviewed at a Black Crowes concert were sorta, y'know, like, in favor of legalizing pot, man. Or something. What was the question again?

(To be fair, the column is about a more general (unscientific) survey of fans at Shea Stadium, and can be found online with a more sensible headline (registration may be required), so it's really more of a Great Moment in Stupid Headline Writing. But he did ask the strike question, and did get the "well, duh" response indicated...)

Posted at 9:38 AM | link | follow-ups | no comments

Move Along, Nothing to See Here

Well, I dropped off the "Blogs of Note" list. On the bright side, this probably means we won't completely shatter the bandwidth limits on

Anyway, I'm no longer noteworthy, so you probably shouldn't be reading this. If you'd like to go somewhere else, may I recommend The Library of Babel, where you can find a new entry on Greg Bear's Vitals?

Posted at 9:26 AM | link | follow-ups | 1 comment

ΔxΔp ≥ h / 4 π

My stuff
What's with the name?
Who is this clown?
Does he know what he's talking about?
Archived Posts
Index of Physics Posts
RSS, version 0.91
The Library of Babel
Japan Stories

Δ E Δ t ≥ h / 4 π

Other People's Stuff

AKMA's Random Thoughts
Arcane Gazebo
Arts and Letters Daily
Boing Boing
Chronicles of Dr. Crazy
Confessions of a Community College Dean
Cosmic Variance
Crooked Timber
Brad DeLong
Diary de la Vex
Drink at Work
Easily Distracted
Electron Blue
John Fleck
Grim Amusements
David Harris's Science and Literature
Hellblazer
In the Pipeline
Invisible Adjunct
Izzle Pfaff
Knowing and Doing
The Last Nail
Learning Curves
The Little Professor
Making Light
Malice Aforethought
Chris C. Mooney
Musical Perceptions
My Heart's in Accra
Michael Nielsen
Not Even Wrong
Notional Slurry
Off the Kuff
One Man's Opinion
Orange Quark
The Panda's Thumb
Perverse Access Memory
Political Animal
The Poor Man
Preposterous Universe
Pub Sociology
Quantum Pontiff
Real Climate
The Reality-Based Community
SciTech Daily
Sensei and Sensibility
Talking Points Memo
Through the Looking Glass
Unmistakable Marks
Unqualified Offerings
View From the Corner of the Room
What's New
Whiskey Bar
Wolverine Tom
Word Munger
Yes, YelloCello
Matthew Yglesias

Book Stuff

Book Slut
Neil Gaiman
The Humblest Blog on the Net
Pam Korda
Outside of a Dog
Reading Notes
Seven Things Lately
The Tufted Shoot
Virtual Marginalia
Weasel Words
Woodge's Book Report


ACC Hoops
College Basketball (2.0)
Dave Sez
Hoop Time 3.0
The Mid-Majority
Set Shot
Tuesday Morning Quarterback

Δ N Δ Φ ≥ 1 / 2


75 or Less Album Reviews
Rotten Tomatoes
The Onion A.V. Club

Geek Stuff

Annals of Improbable Research
Astronomy Picture of the Day
Britney Spears's Guide to Semiconductor Physics
The Comic Book Periodic Table
MC Hawking's Crib
The Museum of Unworkable Devices
Myths and Mysteries of Science
The Onion
Physics 2000
Sluggy Freelance
Web Elements
Physics Central (APS)
This Week's Finds in Mathematical Physics

Useful Stuff

Web Design Group

While it is my fervent hope that my employers agree with me about the laws of physics, all opinions expressed here are mine, and mine alone. Don't hold my politics against them.

Weblog posts are copyright 2003 by Chad Orzel, but may be copied and distributed (and linked to) freely, with the correct attribution. But you knew that already.

If you use Internet Explorer, and the text to the right cuts off abruptly at the end of this column, hit "F11" twice, and you should get the rest of it. We apologize for the inconvenience.

Powered by Blogger Pro and BlogKomm.

Steelypips main page.