Rory provides some insight into what goes on inside of your HDTV, and exactly what role Oompa Loompas and Hobbits play in displaying your picture:
My own take on the 720p vs 1080i debate:
720p has an addressable pixel resolution of 1280x720. If you want to show 60 frames a second of this resolution in a progressive mode, then you need to have bandwidth capable of addressing 1280*720*60 => 55,296,000 pixels per second.
1080i has an addressable pixel resolution of 1920x1080. If you want to show 60 frames a second of this resolution in a progressive mode, then you need to have bandwidth capable of addressing 1920*1080*60 => 124,416,000 pixels per second.
But, if you interlace the 1080's resolution, then you only need to have bandwidth capable of addressing 1920*540*60 => 62,208,000 pixels per second.
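The arithmetic above can be sketched as a tiny helper. This is just an illustration of the numbers in the paragraphs above; the function name and signature are my own invention, not anything from a real video API:

```python
def pixel_rate(width, height, fps, interlaced=False):
    """Pixels addressed per second for a given mode.

    Interlacing sends only half the lines in each field,
    so the effective line count per refresh is halved.
    """
    lines = height // 2 if interlaced else height
    return width * lines * fps

print(pixel_rate(1280, 720, 60))                    # 720p  -> 55,296,000
print(pixel_rate(1920, 1080, 60))                   # 1080p -> 124,416,000
print(pixel_rate(1920, 1080, 60, interlaced=True))  # 1080i -> 62,208,000
```

As the output shows, 1080i costs roughly the same bandwidth as 720p, at the price described next.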
So, for roughly the same bandwidth, you can have a higher resolution image, but it takes twice as long to build each complete frame. This is great for still shots, but when you have fast movement on the screen, you can get motion artifacts (I suppose the quality of your monitor's deinterlacing abilities will dictate how much artifacting your brain perceives).
However, 60 fields per second (30 complete frames per second) is not a terrible thing. That's what you see when you watch normal television (NTSC). Do you notice motion artifacts? Probably not, unless you're standing one foot away from a 27" television set. You see, your eyes and brain tend to merge the fields together into a fluid motion picture. It also helps that regular television sets have a lower resolution, so individual pixels tend to blur with their neighbors, which in turn helps to hide the combing (motion artifact).
Now I don't own an HDTV set, but I've stared at them in Best Buy on occasion, wishing that Santa Claus would just mistakenly load an extra one in his sleigh, realize that fact while in-flight, and just leave it at my place. And here's what I can say to people debating 720p vs 1080i: Waah, waah, waah.