As If By Satanic Magic
Cloer: About three pills ago I would've been weeping and shrieking right here in front of everybody. What about refinements to last week's ideas?
Koo: Oh we totally beefed up the broom.
Sinagra: The broom is tricked out.
Cloer: Good, because I'm not going to sit down with Mitch and say, 'Here, it's a broom. We're going to sell these little bastards a broom, eerily similar to the one currently sitting in their pantry.'
It's about as plausible as any other explanation you might come up with...
Back in the (Metaphorical) Saddle
The comment problem has been fixed, and a couple of comments that were lost to the permissions change have been restored.
Just so this isn't a completely worthless post, I'll note that I really liked the (much-linked already) Josh Marshall article about trust and the seemingly inevitable "war" with Iraq (scare quotes in honor of the late, great Bill Hicks). He articulates the central problem with the war and its backers very nicely. This is, of course, why he lives the glamorous life of a freelance journalist. Or something.
I also rather enjoyed Andrew Northrup's bit on truth, journalism, and True Belief. Mark Steyn is one of those writers beloved of the "blogosphere" whose appeal I Just Don't Get, and this bit is a good indication of why.
Experiencing Technical Difficulties
The Web host for steelypips.org has recently shifted us to a different server, and in the process, something got messed up with the comments feature. It'll be fixed later, but it just isn't the highest priority at 8:00 am on a school day.
America's Best Colleges (For Some Values of "Best")
There was an interesting article in the Washington Post this past weekend about how colleges and universities have started lobbying each other in an effort to improve their standing in the US News rankings. It's sort of amazing to see how much importance is given to a system that, in terms of making an objective measure of the "best" college, has about as much credibility as college football's BCS rankings. Then again, people seem to think the BCS means something, so maybe it's not surprising...
The central issue is the fact that "Academic Reputation" (or, now, "Peer Assessment") is the single largest factor in the rankings. This qualitative ranking counts more heavily than the quantitative measures of student quality (SAT scores, class ranks, etc.), faculty quality (student/faculty ratios, percent of faculty with PhDs, class sizes), or college resources (per-pupil expenditures, alumni giving, etc.).
The "Peer Assessment" rating is obtained by the incredibly scientific method of sending a letter to the presidents, deans, or chancellors of colleges and universities across the nation, and asking them to rank the other schools in their category. That's it. They ask every college president in the country "What do you think of these other schools?" and that counts for 25% of the overall score.
This is, of course, a system that seems perfectly reasonable to the people at the top (and I didn't hesitate to brag about it when my alma mater was in the top slot of the liberal arts college rankings, as it was during my junior and senior years...), but really, what this amounts to is "Where were you on the list last year?" It's a little hard to really take this seriously.
And yet, colleges bend over backwards in an effort to climb in the rankings. As mentioned in Admissions Confidential, admissions offices face pressure to boost the "selectivity" rating, seeking more applicants and fewer acceptances-- to the point where exceptional students are sometimes rejected on the grounds that they're not likely to enroll if accepted, and would hurt the selectivity. The alumni mailings flooding my mailbox tout the US News rankings as a reason for their begging-- the rankings include both total money raised and percent of alumni contributing, so a school that routinely receives gifts of millions of dollars doesn't hesitate to ask Starving Grad Students to send twenty bucks their way. Six-year graduation rate is a rankings category, which explains why it's really hard to flunk out of a top school-- struggling students and discipline problems are suspended for one year rather than being kicked out, and colleges will turn back flips to find a way to let shaky students graduate. (To be fair, there's a financial motive for this, as well-- if one of your "C" students is going to turn out to be the next Bill Gates, or a future President, you'd like them to remember you warmly...)
Weirder still, the colleges at the top of the list often seem to be regarded as qualitatively different than schools lower down. It's a strange feeling to hear people speak of Williams and Swarthmore as being off in some sort of unattainable region of the educational stratosphere-- my memory's far from infallible, but I don't really recall my friends and me being all that much more dedicated to educational pursuits (or much less drunken and loutish) when we were in school than my current students are.
The weight given to these bizarrely-subjective-but-masquerading-as-objective rankings is even stranger when you consider what a personal thing college really is. Overall academic quality and quantitative resources are a reasonable indicator of the quality of a school, but really, the effect they have on the outcome is dwarfed by other, less quantifiable, factors, ranging from the character of the individual students to the "culture" of the school and whatever groups a given student ends up with. The fact that I played rugby in college had almost as much influence on what classes I took and what I learned as the fact that I majored in physics. Sophomore year, rugby probably had a bigger effect than physics...
Ultimately, the US News rankings are about as useful as, and probably inferior to, the incredibly scientific and utterly objective Uncertain Principles Rankings, presented here in their full glory (unlike the weasels at US News, I won't make you pay to see the full rankings, though if you'd like to send me money for college advice, I won't turn it down...):
All the others are crap, especially Amherst.
Oh, You Meant the Philadelphia Eagles?
Of course, I should also mention that there was happier news in the Post this morning, what with Steve "Ego So Large It Has An Event Horizon" Spurrier getting a rude welcome to actual professional football (as opposed to the football-like substance offered by the Arizona Cardinals). The Washington Post's national news coverage is good enough that the "local rag" aspect of its sports coverage gets rather irritating (at least during baseball and football seasons). It's worth the "All Redskins, All the Time" hassles for the wailing and gnashing of teeth when they get absolutely pasted like they did last night. With both the Pats and Giants winning on Sunday, it was a good sports weekend-- the only thing that would've made it better was a Dallas loss, but you can't always get what you want...
It was also another shining moment for the Prince George's County police department, who managed to accidentally pepper-spray the Eagles' bench. They haven't really gained in cluefulness since March. Or the time a few years back when a nut in College Park holed up in his house with a gun, and they sent a tank in after him...
Great Moments in Depressing Stupidity
The Bush administration has begun a broad restructuring of the scientific advisory committees that guide federal policy in areas such as patients' rights and public health, eliminating some committees that were coming to conclusions at odds with the president's views and in other cases replacing members with handpicked choices.
It's idiot moves like this that prevent me from buying the idea that Bush or his handlers were actually strategic masterminds who maneuvered the UN into accepting their goals for Iraq (leaving aside the fact that the strategy in question is essentially Kissinger's "Mad Bomber" theory transported thirty-odd years into the future). I think they really, truly intended to go it alone on Iraq, and were honestly flummoxed by the fact that people didn't simply fall in line behind their plan.
We've seen this again and again, from "faith-based programs," to the arsenic-in-drinking-water fiasco, to the stacked energy policy committees, to the laughably biased stem-cell ethics panel (which avoided being even more outlandishly slanted only because of public skepticism), to the now-you-see-it-now-you-don't department of Homeland Security (still an awful name), to the make-it-up-as-you-go policy on military tribunals for terror suspects. They do something mind-bogglingly stupid, and when people object, they fumble around confusedly as if they had never considered the possibility that anybody might not agree with whatever the idiotic scheme of the moment was. Given their obsession with secrecy, I shudder to think what sort of executive-order idiocy has been perpetrated out of public sight, and will only be revealed once the current pack of buffoons has left office.
And now, the circus comes back around to the sciences, in what may be the most depressingly stupid and anti-intellectual act since Kansas tried to will evolution away. It'd be funny if it weren't so predictable.
We've been told over and over that one of Bush's great strengths is his business experience. Sadly, he seems intent on bringing to Washington all the very worst practices of big business, starting with the Enron accounting of his tax cut (and its many and changing justifications-- surely it's only a matter of time before they spin the latest tax scheme as a way to cure cancer...), and now moving on to the "If you don't like the research results, buy some new scientists" approach pioneered by the tobacco industry.
For the benefit of the two people who read this but haven't already followed the Diary de la Vex link, well, go read it. If you don't, you're missing great stuff:
O, filet mignon--my dark mistress.
It is perfectly sad how much I like the diminutive steaklette. When I give up meat for good one day, I will cheat twice a year with filet mignon, and it will achieve all the romance of "An Affair to Remember", only with a strip of pink in the center and far less poignancy.
She doesn't update all that often, but when she does, it's like the best of the oft-linked Lileks's family stuff (if he were considerably younger, unmarried, childless, and female), without the annoying political detours.
Crisis of Schadenfreude
Sidney Nagel at the University of Chicago (who I don't know, but Pam probably does) has an opinion piece in Physics Today (warning: they have an incredibly annoying ad server that sometimes causes the page to hang up while loading in Opera-- the text is about 36 kB, so if it's gotten that far, hitting "stop" will let you read the article) solemnly declaring that "Physics is in crisis." The field is riven by divisions between disciplines, "diminish[ing] our sense of a common mission," while
At the same time there has been tremendous excitement in other fields such as biology and computer science. Those fields have now outstripped physics in terms of excitement in the public eye, and we have lost students to those disciplines. This loss makes us uneasy and less confident of the value of our own research areas. While competition for funding has always been stressful, we feel overlooked as more funds are delivered elsewhere (such as to the health disciplines).
He suggests that this critical problem should be attacked in two steps. The second of these is eminently reasonable:
#2) Answer honestly why someone from outside your subfield should be interested in what you are doing. Then, give those reasons clearly in all your talks
(emphasis in original). Crisis or no, that's sound advice, and extends even between disciplines-- physicists giving public talks should articulate why chemists, biologists, and economists should care about their work. Everybody ought to be doing that already, and to the extent that some physicists aren't explaining why other people should care, that's a symptom of a deeper problem than graduate student disenchantment.
His first suggestion, however, needs some work:
#1) Create two symposia for the next March meeting (primarily devoted to condensed matter and materials physics) of the American Physical Society that bring together condensed matter and particle physicists to discuss science of common interest. One of these should be on an experimental topic and one should be on a theoretical topic.
Now, he does allow that, "[f]or extra credit" one could "do this same assignment linking other pairs of disciplines." But the initial formulation is telling, and shows the symptoms of exactly the sort of internal division he's decrying. Despite the occasional claims of particle physicists, there's more to physics than just particle physics, or just condensed matter physics. Nagel correctly notes that the problems faced by the field are a result of the over-identification of physics with high energy particle physics (which may legitimately be in crisis), but this is not improved by expanding the too-narrow definition to include only two sub-fields, even if they are the most populous sub-fields in physics.
The basic idea is good, but it shouldn't be restricted to one pairing-- if you're going to put together a symposium on connections between condensed matter and other fields, cover a range of fields. And it should be noted that it's actually much easier to make connections between condensed matter physics and fields other than particle physics-- a lot of what's going on in BEC research these days has great significance for condensed matter physics, and it's easy to come up with ways for condensed matter to influence biophysics.
As for the larger question, it's not true that all of physics is in crisis. This is probably the most exciting time to be working in atomic and molecular physics since the dawn of quantum theory. Laser spectroscopy, laser cooling, and Bose-Einstein Condensation have completely revolutionized the field, and revitalized a sub-field some thought to be moribund. Quantum optics and quantum information have likewise taken off in recent years, aided by some of the same experimental techniques. Biophysics, whether you consider it a subset of "soft condensed matter" or not, has exploded in recent years, as new experimental techniques and the exponential increase in computing power have opened up new areas of study.
If there's a crisis in physics, it's a crisis primarily on the Big Science end, where the mega-collaborations of particle physicists work. And it's a crisis of their own making-- they led the charge to define physics as only particle physics, and even to define particle physics as only the search for ever more exotic particles at ever higher energies, demanding ludicrously expensive accelerators and colliders to search for fundamental particles which are only interesting because they are fundamental.
That branch of physics has hit a bit of a wall, especially in the US, due to a combination of the increasing difficulty and expense of the work, and the realization by the people with money that the useful output of the field hardly justified the expense. The crisis was started by the loss of funding for the Superconducting Super Collider, and its spread is a product of the SSC hype-- if funding for SLAC and Fermilab is in danger, that's partly because so much time was spent proclaiming that only the SSC could lead to future progress, and that SLAC and Fermilab had had their day. Having decided that the Next Big Device isn't worth the cost, it's not unreasonable to question whether it's worth funding devices whose usefulness was questioned by the very people seeking the funding.
In general, I agree with Nagel's main point, that a more inclusive definition of "physics" would be nice, though I would argue that such a definition has been in place for a long time in the smaller sub-fields-- atomic physicists and biophysicists weren't the ones trying to divide all science into "particle physics" and "stamp collecting."
Having adopted such a definition, though, the question of the magnitude of the "crisis" becomes open to debate. Total physics budgets are dropping, yes, but the cuts are coming mainly in the gargantuan budgets of the Big Science projects. Funding in other sub-fields is basically unchanged, and may have risen. Graduate enrollments are dropping, true, but again, that's partly because the demand for more bodies in the mega-collaborations is dropping. Smaller projects in other fields can get by without huge numbers of students, and better use could be made of the students working in the big collaborations. (A bigger concern is the quality of the students who remain-- are the best students leaving for other fields, or are they sticking around while the plodders and dilettantes depart? That's a harder thing to judge.)
I'm inclined to think that an absolute decrease in the funding and graduate students for all physics sub-fields isn't necessarily a bad thing-- a somewhat smaller total pool of money, distributed more evenly over the sub-fields would probably be a healthier situation than the glory days of Big Science that some people want to return to. In practical terms, you get more useful science out of a million dollars spent on smaller-scale projects in atomic physics, "soft condensed matter," or biophysics than you do from a hundred million dumped into bigger accelerators. Yeah, that would require a dramatic change in the way particle physics is done, but the historical dominance of particle physics wasn't really sustainable anyway.
And realistically, I'd have a hard time making a serious argument that there's anything wrong with shifting some funding from physics into the "health disciplines." In any rational worldview, medicine is a higher priority than just about any field of physics. Physics isn't going to cure cancer or find an AIDS vaccine, no matter how much money we get.
Is there a crisis in physics? I suppose so, in the sense that the field can't really hope to continue doing exactly the same things it has for the last umpteen years-- in that respect, it's like the crisis facing the entertainment industry, though the analogy fails due to the lack of a physics equivalent of the RIAA or MPAA. As with the crisis in entertainment, though, it's not clear that the crisis will necessarily have bad results for the field as a whole, or society as a whole.
And it's not a crisis of the sort where the very survival of the discipline is threatened. For the foreseeable future at least, we'll always need to have people doing the things that physicists do-- studying the fundamental properties of matter, looking for the basic rules underlying the behavior of the universe, and trying to find and exploit the interesting consequences of those rules. We may not need quite as many of them looking for the Upper-Left-Hand-Corner Quark, but to say that physics in general is ailing would be foolish.
All Horses Have An Infinite Number of Legs
Here's the long-threatened introduction to relativity. This is more or less what I said in class on Friday, with a bit less math, and adapted for the Web.
As noted in an earlier post, the knowledge that ought to be picked up by a student in the first two terms of a physics curriculum is roughly comparable to the state of the art in the late nineteenth century. They're lacking a little bit of statistical mechanics and thermodynamics, and a good deal of math, but the basic ideas they get in the first two terms (mechanics and E&M) are the solid core of physics as it existed in the late 1800's.
That covers quite a lot of stuff. The key ideas of mechanics-- Newton's Laws, and the principles of conservation of energy, momentum, and angular momentum-- tell you more or less everything you need to know about the motion of everyday objects (even up to things like stars, planets, and galaxies). Those ideas are essentially the codification of common sense, and the intuition you gain about the way the world works just by living here. There are some surprising consequences, particularly in angular momentum, but generally, the laws of mechanics behave in what everyone would agree is a perfectly rational manner.
The other large chunk of material students get in the first two terms of physics covers electricity and magnetism. Understanding the nature and behavior of electricity and magnetism, particularly the unification of the two in Maxwell's Equations, was the crowning achievement of nineteenth-century science. From Maxwell's equations, you can understand and predict everything you need to know about the behavior of charged objects, electric currents, and magnets. Maxwell's equations even allow you to understand the nature of light-- light is made up of oscillating electric and magnetic fields, propagating together through space.
Putting these two together, you can explain all sorts of things, and physicists of the late 1800's were justly proud of their accomplishments. Some of them were sufficiently convinced of the power of these theories, and the essentially rational nature of the universe, that they made cocky statements about being on the road to knowing everything, once they could get the fiddly little details worked out.
Those statements, like most such predictions, were destined to be thrown back in their faces for years and decades to come, because while the achievements to that point were impressive, in reality, the wheels were about to come off for classical physics. Mechanics and Maxwell's equations are completely incapable of describing phenomena on an atomic scale, and experiments were being done in the late 1800's that would lead to the revolution of quantum mechanics.
But there was another problem as well, which amounted to a fundamental conflict between these twin pillars of classical physics. It has to do with light, and in particular the speed of light.
As noted above, Maxwell's equations allow you to describe light as a combination of oscillating electric and magnetic fields. They also allow you to calculate the speed of light, generally designated "c". It's a really big number-- 299,792,458 meters per second, or roughly 186,000 miles per second for those who think in English units. Light moves awfully fast.
It's also a single number. When you calculate the speed of light from Maxwell's Equations, it's a very simple function, depending only on two universal constants. There's nothing in there that depends on the frequency of the light, or the nature of the source, or the speed of the source.
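To make that concrete: the "very simple function" is c = 1/sqrt(ε₀μ₀), where ε₀ and μ₀ are the two universal constants (the electric and magnetic constants) that appear in Maxwell's equations. Here's a quick sketch of the arithmetic-- the constant values below are the standard SI ones, not anything from this post:

```python
import math

# Speed of light from Maxwell's equations: c = 1/sqrt(epsilon_0 * mu_0).
# Standard SI values for the two universal constants:
epsilon_0 = 8.8541878128e-12   # permittivity of free space, F/m
mu_0 = 4 * math.pi * 1e-7      # permeability of free space, N/A^2

c = 1 / math.sqrt(epsilon_0 * mu_0)
print(c)   # about 2.9979e8 m/s
```

Note that neither constant says anything about the frequency of the light or the motion of its source-- which is exactly the point.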
But that's where the conflict arises-- common sense tells you that light should travel at different speeds, depending on how fast the source is moving. If I'm riding in a pick-up truck moving at 10 m/s, and I throw a rock at 10 m/s at a sign on the side of the road, the speed of the rock when it hits the sign is 20 m/s: the sum of the speed at which I threw the rock, and the speed at which the truck was moving. In a similar vein, if I'm riding in a rocket ship moving at half the speed of light, and switch on my headlights (it's dark in space, after all...), a stationary observer should see the light moving at the speed of light plus the speed of the rocket, or one and a half times the speed of light. And yet, Maxwell's Equations say there's only a single value.
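That "common sense" rule is just Galilean velocity addition: the speed of the source and the speed of the projectile simply add. A minimal sketch of the arithmetic from the paragraph above, using the post's own numbers:

```python
c = 299792458.0   # speed of light, m/s

# Rock thrown from a moving truck: Galilean addition just sums the speeds.
truck_speed = 10.0   # m/s
throw_speed = 10.0   # m/s
rock_speed = truck_speed + throw_speed
print(rock_speed)    # 20.0 m/s, as common sense says

# Headlights on a rocket moving at half the speed of light: the same
# rule predicts the beam moves at 1.5c, which Maxwell's equations forbid.
rocket_speed = 0.5 * c
beam_speed = rocket_speed + c
print(beam_speed / c)   # 1.5
```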
This was a major problem. Maxwell's equations were the crowning glory of nineteenth century physics. Nobody wanted to have to discard them, and they work wonderfully well in other situations. And yet, they're in direct conflict with the laws of mechanics, and basic common sense, and nobody wanted to discard either of those. Classical Mechanics was also incredibly successful, and you've got to believe that common sense counts for something, otherwise, we might as well go back to the days when we huddled in the rain around camp fires and made up stories about angry gods who like to see us suffer.
People came up with all sorts of baroque tweaks on the basic structure of Maxwell's equations in an attempt to resolve this conflict. Some of them sort of work, but they all fail on one level or another, and none of them are very satisfying.
What was needed, as with any conflict between two competing theories, was an experiment to resolve the issue. Somebody needed to measure the speed of light, and determine whether it really does depend on the motion of the source. The problem is, the speed of light is 299,792,458 m/s-- even today we can't get anywhere near that speed. The fastest-moving man-made object is probably the Voyager 1 spacecraft, heading out of the Solar System at 17,400 m/s, and that's after a few laps around Jupiter to pick up speed. As for getting something up to a reasonable fraction of the speed of light in the late 1800's, forget it.
What's needed is a combination of the fastest-moving object around, and a very clever measurement technique. The fast-moving object is the Earth, which orbits the Sun with an average velocity in the neighborhood of 30,000 m/s. That's still one ten-thousandth of the speed of light (give or take), but it's about as good as you're going to get. And given a sensitive enough measurement technique, that's enough to detect.
The measurement technique is light interferometry, developed by Albert Michelson. The device he used, now known as a Michelson interferometer, consists of a beam of light split into two beams which are sent off at right angles to one another, then brought back together and re-combined. When the beams from the two arms are combined, they interfere with one another, giving you a pattern of bright and dark lines. If you do something that changes the round-trip time for one of the beams-- lengthening one arm, say-- the pattern of lines will shift in a predictable manner. An extremely large Michelson interferometer (with arms kilometers long) is at the center of the LIGO (Laser Interferometer Gravitational-Wave Observatory) project, looking for small wiggles in the fabric of space due to cataclysmic astronomical events.
Another way to change the round-trip time, of course, is to change the speed of light. If light moves faster in the direction that the Earth is moving, as common sense says it should, then the pattern of lines should be different than what you'd get if the speed of light were a constant. If you rotate the interferometer, so that first one arm, then the other, is aligned along the Earth's motion, you should see a back-and-forth shift of the pattern.
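For a sense of scale, the shift expected when the apparatus is rotated through ninety degrees is, to leading order, about 2Lv²/(λc²) fringes. Plugging in typical textbook figures for the 1887 apparatus-- the arm length and wavelength below are my assumptions, not numbers from the post:

```python
# Estimated fringe shift for a Michelson-Morley-type experiment, using
# the standard leading-order formula: dN ~ 2 L v^2 / (lambda c^2).
# Arm length and wavelength are typical textbook values (assumed here).
L = 11.0             # effective arm length, m (folded optical path)
wavelength = 5.0e-7  # wavelength of the light, m (~500 nm)
v = 3.0e4            # Earth's orbital speed, m/s
c = 299792458.0      # speed of light, m/s

delta_N = 2 * L * v**2 / (wavelength * c**2)
print(delta_N)   # roughly 0.4 of a fringe
```

A shift of four-tenths of a fringe was comfortably larger than what the apparatus could resolve, which is why the null result that follows was so convincing.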
That's what Michelson did, with his collaborator (and famous Williams alumnus) Edward Morley. They built an interferometer on a great big slab of sandstone, floated the slab on a vat of mercury (OSHA wouldn't've approved of lab practice back then), and spun it around. They did this during the day, at night, at all different times of the year, and got the same answer every time: there was no shift. The lines never moved, even though theory, and common sense, said they should.
So, Maxwell's equations are right. Light moves at one, and only one speed, independent of the speed of the source. Common sense, it turns out, is wrong in this case. So where's the mistake? Is common sense just wrong? Are we living in a Ted Chiang story, where 1=0?
It turns out that the common-sense argument is like one of those joke proofs that 1=2. They look convincing for a little bit, but there's always a step in there where you divide by zero, and all bets are off after that. It's not a divide-by-zero error, but there's a subtle mistake in the argument that leads to the conclusion that light should move at different speeds-- a mistaken assumption so basic that most people don't even notice making it.
That's where Einstein comes in-- he spotted the mistake, and showed that fixing it also fixed all the other problems with the theory. Of course, it also required modifying the laws of mechanics, but then you can't have everything... What the mistake is, and how Special Relativity fixes it, will have to wait for another post.
Advice to New Faculty
The new term started this past week, with a larger than usual crop of eager young freshmen turning up Monday (the 9th) for orientation before classes started on Tuesday. They're very young (born in 1985-- they almost certainly don't remember Reagan being President, and they're probably hazy on Bush the Elder), and very eager, and it generally looks like a good crew.
Before they showed up (the previous Thursday), there was another orientation, this one for new faculty. A little over a year ago, I was one of the new professors wandering confusedly around the campus (new faculty are easy to spot-- they obviously don't know where anything is, but they're too old to be students, and not dressed well enough to be parents of students...); this year, I was one of the second-year faculty asked to speak at the new faculty orientation.
Most of what I said wasn't really general-interest material, being a little too specific to the small liberal arts faculty thing. It's really not worth posting advice on syllabus construction or classroom discipline to the internet (though as a general comment, The Poor Man has a good guide to how not to conduct a class...). But a few of the things that came to mind were probably worth a blog post. Most of them can be condensed into two statements: 1) This Job Can Eat Your Life, and 2) That's Not a Bad Thing.
My stock response for most of the people who would ask me how I liked my first year as a college professor was "It's a lot more work than it looked like when I was a student." Which is true-- until you actually have to teach a whole class, you don't really appreciate how much work goes into it. In retrospect, it should've been obvious-- I probably spend at least four or five hours prepping a one-hour research talk, and a lecture class is essentially three of those a week. And that's before you get into the issues of choosing demos, setting up demos, writing exams and quizzes, grading exams and quizzes, choosing or writing homework problems, grading homework, posting homework solutions on the Web, setting up labs, grading labs, and answering questions at odd hours (it really doesn't matter what time you designate for office hours-- in fact, posted office hours are the least likely time for students to show up-- students will turn up at all sorts of strange times, expecting help on the homework). Teaching is a time-consuming job, and that doesn't even get into the question of doing research.
Even beyond that, though, it's a job that preys on your mind. When the term is in full swing, I find myself thinking about classes more or less constantly-- making up exam questions while doing laundry, re-thinking the structure of a lecture while I'm in the shower in the morning, or planning labs and demos in the produce aisle at the supermarket. It's an incredible distraction from other concerns-- wet clothes sit forgotten in the washing machine, I realize in the car on the way to work that I forgot to shave, or I get home and find that I forgot to buy any actual food when I was at the store. Some of this is just my natural tendency towards obsessiveness, but a good chunk of it is really just part of the job. It's sort of like blogging is for some people-- last winter, when I was teaching mechanics for the first time, I got to the point where I'd see a car skidding off the road in front of me and think, not "Ice!" or "Gee, I should see if they're all right," but "I could use that as an example in class..."
A colleague who stumbled on my book log expressed amazement at how much I was reading during the term, but it was really necessary to maintain sanity-- I had to read fiction for an hour or so at night to reset my brain, or I'd lie there in the dark plotting lecture notes, unable to sleep. It's really fairly pathetic.
The last three paragraphs make teaching sound like a bad lot, but that's the amazing part-- the job will eat your life, but surprisingly, that's not a bad thing. You end up pouring all kinds of effort into this stuff, and other less pleasant stuff, but somehow it all pays off, in the damnedest ways. I had a couple of students in one of my classes who were a gigantic pain in the ass, and there was a bad stretch in the middle of one term when they were disrupting the flow of the class, and generally making me miserable. And in the middle of that, another student in the class went out of his way to drop by my office and apologize for cracking wise once, because he'd been really fired up about getting a good grade on a mid-term. It sounds corny, but that made my whole week.
There are two things that I always manage to forget between terms (at least, for the three between-term breaks I've had thus far). One is just how much energy lecturing takes-- the first few classes of a term, I'm always a little hoarse towards the end, and absolutely ravenous when I get out of class. And Friday, after two partial classes and the first full lecture of the term, I was an absolute zombie when I got home (playing hoops for an hour at lunchtime didn't really help, either).
The other thing I forget is what a kick it is when a class goes well. I'm starving at the end of a good lecture, but also more than a little bit hyper. It's hard to resist the temptation to provide a comprehensive recap to other faculty members, the department secretary, Kate, or random passers-by. Some days I don't resist.
One of the new faculty members this year in another department had worked in theater, and made a comparison between teaching and acting. There's something to that-- lecturing is an odd sort of theater, part scripted, part improvised (I re-invented a couple of bits of Friday's lecture on the fly), and with the same audience every time out. It's not exactly a death-defying spectacle, but you do have to put yourself on the line to some degree-- if you put the lecture across well, maybe they'll learn something, but if you blow it, particularly early in the term, they'll decide you're simply the biggest idiot on the planet, and you'll never really recover. When things go well, and you hold their attention, and the on-the-fly re-writes work out all right, it's an incredible rush.
A lot of things about this job are a pain in the ass-- you're given limited resources and impossible tasks, and if you succeed, you get asked to do even more. And general faculty meetings can occasionally make Dilbert look soothing and idyllic. But all in all, I love my job.