Monthly Archives: July 2010


Blog posts have been sparse recently, mostly because I’ve been working hard: I was in lab until midnight last night. I’ve redesigned my motion phantom setup so that it can run with the scan room door closed. This has major advantages in the department of not getting huge RF contamination artifacts every time a Green Line trolley pulls away from a stop.

Making this work has presented a fun engineering puzzle. The driving motor has to be outside the scan room, and the rope it’s pulling on can only enter through a “waveguide”, a one-foot copper pipe that’s pointed in entirely the wrong direction and already half-filled with electrical cables. I ended up using four K’NEX pulleys and a length of PVC pipe from The Home Depot. None of the pulleys are actually attached to anything, a fact of which I am unreasonably proud. They are all held in place by the tension in the rope. One of them is actually suspended in midair during normal operation.

This is not how I expected to earn my Ph.D.


My apartment is full of leftover public food, some perishable and some not. Much of it dates to roommates who didn’t take everything with them. I hate waste, so I occasionally try to do something about it, by eating it (or getting other people to eat it).

I keep an eye out for confluences of leftover ingredients that could align into a recipe, like Connect 4. This week a convergence of eggs (long past their sell-by date), oil, and buckwheat waffle mix caught my eye.

I love waffles, and we have a waffle iron, so I made waffles for breakfast on Saturday, which went swimmingly. I started scheming for future waffletunities.

Tonight’s leftovers were grapes on the verge of overripeness, some unneeded chopped onion, and Korma sauce that someone got for free about a year ago. These became an appetizer (grapes on a waffle), a main course (sautéed onions and Korma sauce on a waffle), and dessert (grapes with cinnamon).

It was a great dinner, and there’s plenty of batter left for tomorrow’s breakfast, with jam. We still have at least 4 jars of jam whose origins are unknown.

Twilight Zone BIOS

A friend and I spent an hour last night fighting with a live USB boot system. We were using the venerable Unetbootin (who names these things?) to turn live CD ISOs into live USB sticks, then booting them to install to disk. It had worked ten times before, but this time we somehow repeatedly ended up with some wrong, old version on disk. We tried rebuilding the ISOs, with no effect. We thought the ISO creator could have a bug, so we mounted the ISO, but its contents were correct. We thought Unetbootin might be the problem, so we loop-mounted its squashfs, but that looked fine too. We wondered if some subtle corruption was occurring, so we started md5summing everything … and somehow the squashfs checksum was perfect right up until live boot: on the running live machine, it was different.

We were virtually tearing our hair out by the time we realized that there was another USB hard drive plugged into the system. We weren’t selecting it in the boot menu, of course, but we unplugged it … and everything worked. The bug was in the BIOS, which was booting from basically whichever device it felt like.

Moral: Don’t trust a PC BIOS to boot from the USB device you tell it to boot from. When in doubt, unplug everything.


The Senate has officially agreed with the White House to scrap the Ares rockets, and by extension the rest of Constellation. So far, so good. Instead, they’ve decided to fund NASA to make a Shuttle-Derived Heavy Lift Vehicle (i.e. really big rocket based on the shuttle). They don’t really want this rocket, but they need to build it to keep the money flowing to the huge contractors/employers/campaign donors, because it looks bad if you hand them money for doing nothing. Whatever. Fine.

What’s weird about this, to me, is that Ares V was a Shuttle-Derived Heavy Lift Vehicle … and in fact, so, it seems, has been every launcher that NASA has proposed in the last 20+ years. Wikipedia, as usual, has a long list of these proposals, none of which has ever been built.

The specific phrasing of the bill seems to be code for this variant of the idea. It looked sensible enough to me, smarter than Ares anyway … until I realized that they intend to put people on the thing. The whole point of Ares was that the Shuttle was being retired as too dangerous for crew, largely because of its side-mount configuration. This is the same launcher, in the same side-mount configuration. Then again, safety is obviously not the concern, since they also propose to fly a shuttle mission without a rescue/backup vehicle on hand.

The contrast with SpaceX is sharp. SpaceX has one engine/tank/rocket design, which they propose to reuse for all stages and all boosters, across a range of sizes. NASA has a combinatorial explosion of different parts from different suppliers, and a political need to use them all at once.

Real Time

Another small milestone today: I got my first demo of live transverse motion tracking working. That means I acquired a training ultrasound data set, processed it to extract the path topology, generated a lookup table mapping ultrasound lines to (topological) positions, and then used that table to determine position from ultrasound data in real time. Still missing is the MRI integration, which is of course critical for the target application.

I’m surprised this worked as well as it did. For this experiment, “real time” required 2000 cross-correlation searches per second on 3000-sample arrays. I implemented this using FFTW’s length-8192 complex-to-real single-precision inverse FFT and a very naive search for the maximum cross-correlation value. Benchmarks showed the hardware could do just over 4000 searches per second, in a single thread, on a 3 GHz Pentium 4, with the simplest possible C code.
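The core operation can be sketched in plain Python. The real implementation was C with FFTW; this pure-Python version, with a textbook radix-2 FFT, is only meant to illustrate the technique (cross-correlation via the FFT, plus a naive peak search), and the function names and sizes here are illustrative, not from my code:

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def ifft(x):
    """Inverse FFT via the conjugation trick."""
    n = len(x)
    y = fft([v.conjugate() for v in x])
    return [v.conjugate() / n for v in y]

def xcorr_peak(query, reference, n):
    """Circular cross-correlation of two real signals via the FFT,
    returning (lag, value) at the correlation maximum.  n must be a
    power of two at least as long as both inputs."""
    fa = fft([complex(v) for v in query] + [0j] * (n - len(query)))
    fb = fft([complex(v) for v in reference] + [0j] * (n - len(reference)))
    corr = ifft([a * b.conjugate() for a, b in zip(fa, fb)])
    mags = [v.real for v in corr]
    lag = max(range(n), key=lambda i: mags[i])
    return lag, mags[lag]
```

In the real system, one such search runs against each reference line in the lookup table, and the best-matching line gives the (topological) position. The transform length also falls out of this: a full linear correlation of two 3000-sample arrays needs at least 5999 points to avoid circular wrap-around, and the next power of two is 8192.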

FFTW FTW. I had begun to draw up laborious schemes for accelerating the database lookup, like GPU FFTs or an artificial-intelligence search, but FFTW is so fast that it’s irrelevant. With a newer CPU I might not even need to parallelize to hit my final performance targets.

EDIT: I turned this into a quick benchmark to compare the 3 GHz Prescott P4 with our new dual-quad 2.53 GHz Xeon E5630 (an i7-class chip). The Xeon is faster by a factor of 2.66 … so like an 8 GHz P4. Accounting for the clock difference, the i7 is doing triple the P4’s work per cycle. That’s not quite “Moore’s law” for single-thread speedup in 5 or 6 years, but it’s pretty good.

EDIT2: A 2.93 GHz Core 2 Duo (T7500) is faster than the P4 by a factor of about 2.1, which puts the i7’s advantage in perspective.
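For the record, the per-cycle comparison is just the measured wall-clock speedup normalized by the clock ratio. A trivial sketch, using the numbers quoted above:

```python
def per_cycle_speedup(wall_clock_speedup, new_ghz, old_ghz):
    # Normalize out the clock difference: how much more work the new
    # chip does per cycle than the old one, given the measured speedup.
    return wall_clock_speedup * old_ghz / new_ghz

xeon_vs_p4 = per_cycle_speedup(2.66, 2.53, 3.0)  # ~3.15, i.e. "triple the work per cycle"
c2d_vs_p4 = per_cycle_speedup(2.10, 2.93, 3.0)   # ~2.15
```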

HPLIP and the DeskJet F4480

My parents bought a scanner for family photos about 4 years ago, an Epson Perfection 3490. It sat unused for years, and when we tried to scan in a new album, XSane reported an I/O Error and died. The Windows drivers fail in a similarly uninformative fashion. Our best guess is that the lightbulb (a non-user-replaceable cold cathode) died, a situation that the manual hints isn’t handled well by their software.

Next we tried an old Acer/BenQ 620U, but all the SANE utilities on Ubuntu (Hardy and Lucid) crashed with a floating point exception. Our only Windows machine runs Win7, and the 620U (which has a sticker that says “Get Ready for Win98”) has no drivers for anything later than XP. We also have a SCSI scanner around, but our only SCSI card is ISA. No dice.

So we started looking for scanners. The cheapest one was the Canon CanoScan LiDE 100 ($60), but its support under Linux seemed questionable. SANE appears to support it, but only as of three weeks ago, and only in git head. We kept looking.

Eventually we found the DeskJet F4480 (an “all-in-one”). It’s nominally $80, but universally “on sale” for $60. I wonder if they’re selling it below cost and counting on ink cartridge purchases to make up the difference. We looked into Linux support.

It turns out that the F4480 is supported by the most recent versions of HPLIP, HP’s official open source driver package. HPLIP is packaged for Ubuntu, but there are no backports, and the latest version for Hardy is way too old. Reluctant to push this old machine through a dist-upgrade, I looked through the HPLIP site and found the Installation Wizard, which pointed me to the direct install/upgrade procedure.

The procedure is fascinating. First you download a 20 MB shell script, which is in fact a self-extractor containing a tarball as a string inside the script file. When you run the script it displays a series of simple prompts. First, it autodetects your distro+version and requests confirmation. Based on your answer, it makes calls to your package manager (on Ubuntu via Aptitude) to install a bunch of devel packages like python-qt4-devel and libsane-devel.

Once all the prereqs are installed, the script then invokes ./configure, make clean, and make, which compiles a big heap of C and C++ using the aforementioned devel packages. Once make finishes, the script invokes a graphical installer (perhaps just compiled) to complete the installation.
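The embedded-payload trick is easy to imitate. Here is a toy Python illustration of the same idea; HPLIP’s actual installer is a shell script (typically using tail and tar to peel off the archive), and the marker and function names below are made up:

```python
import base64

MARKER = "__PAYLOAD_FOLLOWS__\n"  # hypothetical separator line

def write_self_extractor(path, script_body, payload):
    # Concatenate the executable part and the payload into one file,
    # separated by a marker line.  The payload is base64-encoded here
    # so the file stays plain text.
    with open(path, "w") as f:
        f.write(script_body.rstrip("\n") + "\n")
        f.write(MARKER)
        f.write(base64.b64encode(payload).decode("ascii") + "\n")

def read_payload(path):
    # Everything after the marker line is the embedded archive.
    with open(path) as f:
        _, _, tail = f.read().partition(MARKER)
    return base64.b64decode(tail)
```

A real installer would feed the decoded tail to tar and then run the unpacked configure/make steps, which is essentially what the HPLIP script does.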

On the one hand, this is kind of a horrific install procedure. Tarballs hidden in shell scripts, endless special cases for different distros, an installer that’s half text (not even ncurses) and half qt4, and a big compilation on the user’s machine. Clearly the correct solution would be for Ubuntu to backport HPLIP upgrades for LTS releases, so that you don’t have to upgrade the OS just to plug in a new scanner.

On the other hand, this is a beautiful example of the power of open source. You are free to build an installer any way you like. No one can stop you; it’s between you and your user. You can freely download any prerequisites. Because the source is open, you can compile it on the user’s machine, and not worry about compatibility with different library versions or even different CPU architectures.

The scanner, by the way, is working perfectly, and the images look great. Sometimes ugly is beautiful.

Sign Language Music Videos

There’s an amazing phenomenon occurring on YouTube, just beneath the surface of the zeitgeist: a profusion of music videos in American Sign Language. Specifically, these are videos where someone has taken the lyrics of the music and translated them, with some poetic license, into ASL, then performed it in sync with the original music.

I’m by no means the first person to notice it; the net is brimming with articles about it. Perhaps the most famous performer now is Ally ASL, who has virtually gone mainstream due to the success of videos like this one. She’s far from the only one though; there are literally dozens of different ASL videos just for the song Party in the USA (a song I know rather better than I’d like to admit).

The phenomenon is remarkable, of course, because to make such a video, you need to be able to hear. Most of the producers, it seems, are students or aficionados of ASL as a second language, although some may be deaf people working with a hearing partner. The YouTube comments suggest that many of the viewers are deaf, but by no means a majority. After all, without the music it’s surely a very strange thing to watch, a poetry recital whose constraints are unknown and invisible.

In truth, these videos make the most sense in the middle ground: for the Hard of Hearing, who hear some of the music and are bilingual in ASL but find signs clearer than words. It is perhaps a niche form, then, targeted at those who are neither entirely hearing nor entirely deaf … and so a good example of the Long Tail in action.

Nonetheless, it’s impressive, and with hundreds or thousands of performers over the past two years, clearly a major trend. I personally find the videos really fun, and a bit habit-forming, despite having zero understanding of ASL. I can hardly think of another art form that combines so much multilingual poetry, music, and energetic, intelligent dance.

I wonder where it goes from here. Personally, I’m hoping for combination with advanced audio visualization (see the music!) and a hot new dance trend (sure beats the Soulja Boy!).


Humans can survive when the ambient temperature exceeds core temperature by using evaporative cooling (i.e. sweating) to cool the body below ambient. This only works if sweat evaporates, though, which requires less than 100% relative humidity. I’ve often wondered: what happens when the temperature is above core temperature and the humidity is too high for evaporation?

The last three days have provided a great opportunity to test this scenario, and the answer is “misery”. Unfortunately, it hasn’t been a perfectly fair test, since I’ve been drinking multiple liters of chilled water in the span of a few hours.

In contrast to my apartment, my office is so cold that I wore a flannel shirt at my desk today.


I am 25 years old. Everyone asks how it feels, and I’ve been contemplating that off and on as I spent the day doing chores around the apartment. It feels like I am no longer ahead of schedule. No longer especially young to be doing what I’m doing. I’m on track, like a train due to arrive at the prescribed time.

There are worse things.