What is the bandwidth requirement to stream a 1080p HD movie off a remote server?
Short answer: for 1080p24 (what you find on a Blu-ray movie of a Hollywood film), roughly 4-8 megabits per second when using the fanciest compressed format.
For example, this "1080p" YouTube video (really 798 lines high because of its widescreen "scope" aspect ratio) runs at about 3.6 megabits/sec.
A Blu-ray film will have higher quality than YouTube, but it'll be encoded at more like 25-35 megabits/sec.
Long answer:
"1080p" is a bit ambiguous. Let's assume you're talking about 1080p24. This means that we get a progressive frame 24 times per second. The frame has 8-bit luma (brightness) at a resolution of 1920x1080, and 8-bit chroma (color information) at a resolution of 960x540 -- i.e., 4:2:0 chroma subsampling. This is the format used for Hollywood movies (which are filmed at 24 frames per second) and most scripted prime-time TV and commercials.
Broadcast and digital-cable HDTV (called ATSC) can send this at about 16 megabits per second. If you watch an NBC prime-time broadcast, this is basically what you're getting. ATSC uses an older format called MPEG-2, which dates from about 1995, and is not as efficient as more recent formats.
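For a sense of how much work the compression is doing, here's a back-of-the-envelope sketch of the uncompressed bitrate, using the plane sizes above (Python, round numbers):

```python
# Uncompressed 1080p24 with the sampling described above: one 8-bit luma
# plane at 1920x1080 plus two 8-bit chroma planes (Cb, Cr) at 960x540,
# 24 times per second.
luma_bits = 1920 * 1080 * 8
chroma_bits = 2 * (960 * 540 * 8)
raw_bps = (luma_bits + chroma_bits) * 24

print(f"raw: {raw_bps / 1e6:.0f} Mbps")                 # ~597 Mbps
print(f"vs. 16 Mbps MPEG-2: {raw_bps / 16e6:.0f}:1")    # ~37:1
print(f"vs. 6 Mbps H.264:   {raw_bps / 6e6:.0f}:1")     # ~100:1
```

So even the "inefficient" MPEG-2 broadcast is already discarding something like 97% of the raw bits.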
A Blu-ray will often use a more efficient format called MPEG-4 part 10, or H.264. This is more computationally demanding but about twice as efficient, meaning you can generally get the same quality with half the bitrate. In practice a Blu-ray film will be at around 25-35 megabits/sec (because why not use the space if you have it?), but if throughput is at a premium, H.264 can go down into single digits of megabits/sec.
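The "why not use the space?" aside is easy to make concrete. A rough sketch, assuming a two-hour film and round numbers:

```python
# Rough storage budget for a two-hour film (round numbers, for illustration).
runtime_s = 2 * 60 * 60

for label, mbps in [("Blu-ray-style H.264", 30), ("lean H.264 stream", 6)]:
    gigabytes = mbps * 1e6 * runtime_s / 8 / 1e9
    print(f"{label}: {mbps} Mbps over 2 h -> {gigabytes:.0f} GB")

# Blu-ray-style H.264: 30 Mbps over 2 h -> 27 GB (fits a 50 GB dual-layer disc)
# lean H.264 stream:    6 Mbps over 2 h ->  5 GB
```

A 30 Mbps encode still leaves a dual-layer disc room for audio tracks and extras; a streaming service, paying for every bit, faces the opposite incentive.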
Neither broadcast HDTV nor Blu-ray supports 1080p60 (sixty progressive frames per second) -- 1080p60 is outside what MPEG-2 calls the "High Level" of conformance.
(And as a technical note: Broadcast HDTV does support "true" 1080p24, meaning the MPEG-2 headers can be marked as outputting 24 progressive frames per second, but this isn't used in practice, in order to retain compatibility with the analog NTSC broadcasts at 60Hz. In practice, a signal from NBC will be 1080p24 in substance but formally 1080i60: the contents of the MPEG-2 transmission are indeed 24 progressive frames per second, sent progressively, but these frames are tagged with metadata (really, bits in the headers) instructing the decoder to interlace them and output them with a 3:2 pulldown to make a 1080i60 signal. Whether the decoder actually does this, or just displays the 24 progressive frames that are actually being transmitted, is an implementation detail that depends in part on whether the TV is really interlaced.)
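If the pulldown is hard to picture, here's a toy sketch of the 3:2 cadence: every four film frames become ten interlaced fields (the frame labels and top/bottom ordering are illustrative, not taken from any spec):

```python
# 3:2 pulldown: four film frames (24 fps) -> ten interlaced fields (60/s).
# Each frame is split into a top ('t') and bottom ('b') field; every other
# frame repeats one field to pad 8 fields out to 10.
frames = ["A", "B", "C", "D"]
cadence = [3, 2, 3, 2]          # fields emitted per frame

fields = []
for frame, n_fields in zip(frames, cadence):
    for _ in range(n_fields):
        parity = "t" if len(fields) % 2 == 0 else "b"
        fields.append(frame + parity)

print(fields)
# ['At', 'Ab', 'At', 'Bb', 'Bt', 'Cb', 'Ct', 'Cb', 'Dt', 'Db']
```

Ten fields for every four frames is exactly the 60/24 = 2.5 ratio, which is why the cadence has to alternate between 3 and 2.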
(Technical note 2: In the U.S., all these rates are actually lower by 1000/1001. So "1080p24" is really 1080p23.976024...)
(Technical note 3: I bristle at using "1080p" to describe these formats, because the way they work is by encoding the higher spatial frequencies with fewer and fewer bits, in other words with coarser and coarser quantization steps. Boasting that your 4 Mbps MPEG-4 pt. 10 is still "1080p" is like boasting that your 128 kbps MP3 is still "16-bit audio." Yeah, it was -- before you compressed it! Then you selectively quantized the different frequency bands at fewer than 16 bits. That's how you reduced the bitrate. In your 4 Mbps H.264, do you think you still have 8 bits of resolution at the highest frequency component -- the one that makes it 1080p? No way. Do you think you have ANY bits? :-D)
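If you want to see that in miniature, here's an illustrative sketch of frequency-dependent quantization. To be clear, this is not H.264's actual quantizer; it's the same principle applied to a single 8x8 block, assuming NumPy/SciPy, with a step-size matrix made up for the example:

```python
import numpy as np
from scipy.fft import dctn, idctn

# Illustrative only: quantize one 8x8 image block in the DCT domain,
# with step sizes that grow toward the high-frequency corner.
# (Same idea as JPEG/MPEG quantization matrices; H.264's scheme differs.)
x = np.arange(8, dtype=float)
block = 100 + 10 * x[None, :] + 5 * x[:, None]   # smooth gradient as stand-in content

coeffs = dctn(block, norm="ortho")               # 2-D DCT of the block

u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
step = 4.0 * (1 + u + v)     # made-up step matrix: coarser steps at higher frequencies

quantized = np.round(coeffs / step)              # most high-frequency entries round to 0
reconstructed = idctn(quantized * step, norm="ortho")

print(f"nonzero coefficients kept: {np.count_nonzero(quantized)}/64")
```

On smooth content almost every coefficient rounds away to zero; that's where the bitrate savings come from, and where the per-frequency bit depth quietly drops to nothing.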
E.O. Stinson, Bioinformatics, casual cyclist, hobby...
This question may currently be too vague. Most video compression formats are lossy, which means you can stream a 1080p-resolution video at various levels of quality: lower quality shows up as degraded image and sound (blockiness, blurring, etc.) but the same number of pixels (which is really all "1080p" defines). This is true even of over-the-air and cable broadcasts; some broadcasters squeeze the stream down a smaller channel by compressing more aggressively, even though the result is still technically 1080p.
In Handbrake, where the default encoding presets are tuned for DVD rips, the "regular" profile for H.264 1080p is about 1500 kbps, and the "high quality" profile is 1800 kbps. These are for video only; audio adds bandwidth on the order of 128-256 kbps (assuming you're encoding as MP3 or AAC).
For really high-quality 1080p, I often see bitrates as high as 3500kbps, including audio.
Translating into Mbps, you're looking at roughly 1.6 Mbps on the low end (video plus audio) and 3.5 Mbps on the high end. For comparison, 802.11g wireless networks nominally peak at 54 Mbps but deliver more like 22 Mbps of real throughput in the best case, and 802.11n can go several times higher. (Note that with typical in-the-city interference, I've often had trouble streaming high-quality video over an 802.11g connection; the peak throughput requires pretty perfect conditions.)
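Those figures are easy to sanity-check (Python; all rates are the round numbers above, and real wireless throughput varies a lot):

```python
# Sanity check of the figures above (round numbers, video + audio).
streams_kbps = {
    "regular profile + audio": 1500 + 128,
    "high quality profile + audio": 1800 + 256,
    "very high quality (incl. audio)": 3500,
}
wifi_mbps = 22   # realistic best-case 802.11g throughput

for name, kbps in streams_kbps.items():
    mbps = kbps / 1000
    print(f"{name}: {mbps:.1f} Mbps ({wifi_mbps / mbps:.0f}x 802.11g headroom)")
```

Plenty of headroom on paper; as noted, interference erodes it quickly in practice.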