Do not sweat it. 1080i is actually better than 1080p. It sounds counter-intuitive, but you have to look at the second part of the spec, the number of pictures per second, to understand why. The only 1080p standard supported by the Blu-ray spec is 24fps, versus 60 fields per second for 1080i. In other words, a 1080i signal carries 25% more information than 1080p: 60 fields per second divided by 2 gives the equivalent of 30 full frames of lines per second, versus 24.
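A quick back-of-the-envelope check of that ratio (a sketch; the function name is mine, and this counts raw pixel throughput only, ignoring compression):

```python
# Rough pixel-throughput comparison: 1080i/60 delivers 60 fields/s of
# 1920x540 pixels; 1080p/24 delivers 24 frames/s of 1920x1080 pixels.

def pixels_per_second(width, lines_per_picture, pictures_per_second):
    return width * lines_per_picture * pictures_per_second

i60 = pixels_per_second(1920, 540, 60)   # 1080i: each field has 540 lines
p24 = pixels_per_second(1920, 1080, 24)  # 1080p/24: full frames

print(i60, p24, i60 / p24)  # ratio is 1.25, i.e. 25% more pixels/s for 1080i/60
```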
David, on any 1080p set bigger than 40 inches you will definitely see a difference between 1080i and 1080p, especially if you are using good video cables and an above-average TV; if not, you need to see an eye doctor.

Netscorer, your statement is incorrect: 1080i is NOT visually better than 1080p. If it were, all our Blu-rays would be 1080i (99% are 1080p). The difference is the source material. Movies shot on film are transferred to disc at 1920 x 1080 pixels progressive (that is, every horizontal line recorded) at 24 frames per second. Anything shot with a video camera can be up to 1920 x 1080 pixels interlaced (the first field carries the odd lines, the next the even ones) at 60 fields per second. So the 1080p discs have more information because every "frame" contains all 1080 lines, while a "field" on a 1080i disc has only 540 (every other one).

Remember, 1080i/60 video only ever captures every other line; half of each "picture" is thrown away. Recovering it on a digital display requires either some clever thinking on the part of the electronics to guess what is missing and fill it in (usually by comparing three or so fields to see what changed) or, worse, simply stretching each single-pixel row to two rows, which in effect halves your vertical resolution. Either way, deinterlacing errors are quite visible, and motion is much worse: jaggies, moiré patterns, judder on pans, zooms, and moving objects. Film, on the other hand, has no missing information. So more bandwidth does not mean a better-looking picture, and in this case it definitely does not. That is why the industry has chosen 1080p at 24 frames per second as the standard for Blu-ray and in our above-average TVs.
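The crude "stretch each row to two rows" approach described above can be sketched in a few lines (a toy illustration, not any real player's algorithm; the function name is mine):

```python
import numpy as np

# Toy "bob" deinterlace: keep only the field's 540 lines and double each
# one, which is the line-stretching fallback described above. The output
# has 1080 rows but only 540 distinct lines of detail.

def bob_deinterlace(field):
    """field: (540, width) array of one field's lines -> (1080, width) frame."""
    return np.repeat(field, 2, axis=0)  # each single-pixel row becomes two rows

field = np.arange(540 * 4).reshape(540, 4)   # stand-in for one 1080i field
frame = bob_deinterlace(field)
print(frame.shape)  # (1080, 4) -- full height, halved vertical resolution
```

The smarter motion-adaptive deinterlacers the comment mentions compare several fields and only fall back to this kind of interpolation where the picture is moving.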
The industry chose 24fps because it was the long-accepted Hollywood filming standard, not because it is a superior format. The human eye needs somewhere around 30fps to perceive a smoothly moving picture. So while you describe the potential issues with interlaced presentation more or less accurately, it would be nice if you presented the whole picture, not just half of it :-). BTW, what you forgot to mention is that even though the interlaced format throws away half of the vertical resolution, it does so alternately for odd and even lines, so within any given 1/30 of a second you do get all 1080 lines. And the algorithms that let a TV (or receiver) fill in the blanks keep getting better, so it is really not as bad as you describe.
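The point about alternating fields can be shown concretely (a sketch with names of my own choosing; real deinterlacers also have to handle motion between the two fields):

```python
import numpy as np

# Two consecutive fields together cover all 1080 lines, so for a static
# picture, simply weaving them back together recovers the full frame
# within 1/30 s (two fields at 60 fields/s).

def weave(first_field, second_field, width):
    frame = np.empty((1080, width), dtype=first_field.dtype)
    frame[0::2] = first_field    # field 1 carries lines 0, 2, 4, ...
    frame[1::2] = second_field   # field 2 carries lines 1, 3, 5, ...
    return frame

original = np.arange(1080 * 3).reshape(1080, 3)  # stand-in for a full frame
rebuilt = weave(original[0::2], original[1::2], 3)
print((rebuilt == original).all())  # True -- static content loses nothing
```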
Actually, if you have an above-average TV and/or Blu-ray player, there is *less* chance you will see a difference between 1080i and 1080p. This is because good video processing, like that found in the Oppo BDP-83, Panasonic DMP-BDT350, and the like, can resolve 1080i fully to 1080p with zero resolution loss, per the S&M and HQV test discs. If you have a lesser player like the PS3, Sony BDP-360, Panasonic BD65, Samsung, etc., then you may run into resolution loss with 1080i.
Except that if you're running 1080i, it's your TV that's doing the de-interlacing. The player takes the 1080p/24 source material and converts it to 1080i/60 for older plasmas/LCDs. If you have a good TV, it will have a good de-interlacer that processes the video well. This also reminds me of the 1080i vs. 720p debate: 1080i should have better resolution, while 720p has better motion. Well, I have an older Panasonic plasma as one of my TVs, and I've noticed it processes 1080i better than 720p (in both sharpness and motion).
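That 1080p/24-to-1080i/60 conversion is normally done with 2:3 pulldown. Here is a cadence-only sketch (my own function name; a real player also alternates top and bottom fields rather than repeating whole frames):

```python
# 2:3 pulldown cadence: alternate between taking 2 fields and 3 fields
# from each film frame, so 24 frames/s becomes 60 fields/s (ratio 5:2).

def pulldown_2_3(frames):
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3   # 2 fields, then 3, then 2, ...
        fields.extend([frame] * copies)
    return fields

frames = ["A", "B", "C", "D"]            # four film frames
fields = pulldown_2_3(frames)
print(fields)  # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# 4 frames -> 10 fields, the same 5:2 ratio as 60 fields/s over 24 frames/s
```

A good TV-side de-interlacer detects this cadence and reassembles the original 24 progressive frames, which is why the conversion can be lossless in principle.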
As for TV shows, a lot of them are mastered in 1080p, even if some of the digital cameras used actually capture in 1080i. I'm not sure about Life, but I know Earth was mastered in 1080p even though 720p, 1080i, and 1080p cameras were all used. The main reason studios stick with 1080p is that they have nice, expensive video processors to de-interlace and scale the image.
I agree with everything but "good quality cables." Buy Amazon-brand HDMI cables: there is absolutely no difference between an $80 HDMI cable and a $2.99 one. HDMI carries digital information, which either arrives intact or fails outright; it cannot gradually "deteriorate" the way an analog signal can (and even with speaker cables, you could buy cheap ones from Monoprice and still not be able to detect a difference). I am talking about HDMI cables, but in practice the same holds for component cables at normal lengths, even though those are analog.