[Tech] Oh '720p'...
Why do they call you 720p? I find your entire name to be misleading.
The first part, 720. I have numerous "720p" movies, but only a few of them are actually 720 pixels tall. The others are all odd heights in the 500s to 600s range. This is because, of course, many movies are released at wider-than-16:9 aspect ratios. Of my "720p" movies, their heights are 528, 532, 534, 536, 544, 688, and 720 pixels. Now, this isn't necessarily a bad thing--any sufficiently high-resolution video can be made into true 720p by cropping the top/bottom or left/right and scaling, but while that might give you more picture data (filling the "full" 1280x720 pixels available), it would lose parts of the image you're intended to see--thus, barring universal adoption of a theoretically ideal aspect ratio, the best you can really ask for is the native aspect ratio of the film, even if it means fewer actual pixels (anybody remember "full-screen" DVDs? Eew). But I digress. Point is, all the "720" really defines, in practice, is the maximum allowable height, which is a silly metric--I'd prefer minimums rather than maximums.
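Here's a quick back-of-the-envelope sketch of where those odd heights come from: just divide a fixed 1280-pixel width by a few common theatrical aspect ratios. The mod-16 rounding is an assumption on my part about what encoders typically do, but it roughly reproduces the 688, 544, and 528 figures above.

```python
# Height of a 1280-pixel-wide encode at common aspect ratios.
# Illustrative only: real encodes also crop and round, often to multiples of 16.
ASPECT_RATIOS = {"16:9": 16 / 9, "1.85:1": 1.85, "2.35:1": 2.35, "2.40:1": 2.40}

def mod16(x: float) -> int:
    """Round down to the nearest multiple of 16 (a common encoder constraint)."""
    return int(x) // 16 * 16

for name, ratio in ASPECT_RATIOS.items():
    exact = 1280 / ratio
    print(f"{name:>7}: exact {exact:6.1f} px tall, mod-16 {mod16(exact)} px tall")
```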
The second part, "p". This is silly for 720p because there's nothing else out there. There is no 720i. I mean, I guess you could MAKE a video file like that, but no devices are made to work with such things... 720p is always progressive. Perhaps they just kept the p because if you say 720, it's a number; if you say 720p, it's a tech term. Either way, though, it's still silly, because it implies there's a non-p 720 out there.
I make the following suggestions. Rather than using stupid terms like "widescreen 480p" or "720p" or "1080i" or "1080p" or other such things, let's just keep it simple. We're coming into the era of widescreen (some would argue we've been there for years), so let's define things based on screen width, which, unlike height, remains constant for widescreen content. Let's get rid of interlaced stuff entirely, because interlacing is garbage, and the future is in stuff which DOESN'T draw lines one at a time with an electron gun. Thus we get this (with a rough code sketch of the mapping after the list):
Anything below 480p becomes deprecated
480p or "SD" becomes 640w
Widescreen 480p becomes 854w
720p becomes 1280w
1080i becomes Satan's bastard child of HD with interlacing
1080p becomes 1920w
2560x1440 "1440p" and 2560x1600 become 2560w (both support 16:9 and "wider", meaning less tall, aspect ratios)
3840x2400 becomes GODw despite having a sub-16:9 aspect ratio.
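If you wanted to express that mapping in code, a toy version might look like this (the names and cutoffs are purely my own invented scheme from the list above; nobody actually uses them):

```python
# Toy sketch of the width-based naming proposed in the list above
# (my own invented scheme, for illustration only).
def width_name(width: int, interlaced: bool = False) -> str:
    if interlaced:
        return "Satan's bastard child of HD with interlacing"
    if width < 640:
        return "deprecated"
    if width >= 3840:
        return "GODw"
    return f"{width}w"

for w, h in [(640, 480), (854, 480), (1280, 720), (1920, 1080), (2560, 1440), (3840, 2400)]:
    print(f"{w}x{h} -> {width_name(w)}")
```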
Sure, "nineteen-twenty-dubbal-yew" doesn't have the same ring as "ten-eighty-pee", but it's sure as shit more descriptive.
Also, let's get rid of this 24 and 23.97 and 25 fps nonsense. In Twile's world of New Media, all video refresh rates are multiples of 30, with 30 being the absolute minimum baseline, 60 being the ideal standard, and 90 being the peripheral-vision-tickling high-end which aims to push your visual acuity to the max. Sidenote: in Twile's world of New Media, everything must support at least 5.1 audio, even if some of the data is duplicated, just to ensure identical playback with the proper sound balance across all devices (my computer tries to strip the speech out of music and movies to only put it in the front audio channels while retaining other sound effects and music in the back channels--it's largely successful, but with some songs the resulting sound from the rear speakers is extremely processed-sounding and tinny).
Video is one of those things where we really need a full overhaul. Two HD resolutions with three basic resolution/refresh combinations (720p, 1080i, 1080p). At least three major movie aspect ratios (1.77:1, 1.85:1, 2.40:1) in addition to older non-widescreen. Two major optical disc formats, totally incompatible, for distributing movies, as well as many more file types and codecs (H.264/DivX/XviD/WMV9 and MP3/MP4/Vorbis/WMA) used for internet "distribution". A whole plethora of audio standards combining buzzwords such as "lossless", "digital", "HD", "true", "plus", "surround", "live", and "pro", to name a few. No wonder this stuff scares average consumers--I can't even figure out who supports what and which sticker to look for on a box. Any idiot, though, can recognize what a 4:3 TV, DVD player, and stereo sound all look/sound/feel like.
So yes. I wish there was a nice video revolution. I wish high-quality H.264-encoded DRM-free 5.1 AC3 or better versions of all movies were made available in "1920w" via an online store. I'd totally pay for a few downloads a month...
And a final sidenote (as if this whole journal isn't already one). My HD collection currently comprises 19 "1280w" movies spanning 125 GB of drive space, another 20 GB or so for the first 19 episodes of Heroes in 720p, and finally 25 GB more for all of Planet Earth in 720p. All this to be joined by 4 new movies over the next 22 hours, filling a further 25 GB of space. Some pretty nice stuff. Even more exciting is to think that "1920w" content has 125% more data/detail, the same relative jump that "1280w" had over widescreen SD video. It is essentially... the high definition to high definition.
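If you want to check that 125% figure, it's just pixel counting (I'm using 854x480, 1280x720, and 1920x1080 as the nominal full frames):

```python
# Pixel-count comparison behind the "125% more detail" claim
# (nominal full frames; real encodes are often shorter due to aspect ratio).
pixels = {name: w * h for name, (w, h) in
          {"854w": (854, 480), "1280w": (1280, 720), "1920w": (1920, 1080)}.items()}
print(f"1280w vs 854w : {pixels['1280w'] / pixels['854w'] - 1:.0%} more pixels")
print(f"1920w vs 1280w: {pixels['1920w'] / pixels['1280w'] - 1:.0%} more pixels")
```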
Final sidenote P.S.: I'll stop using those 1280w and 1920w terms now because nobody else will understand them :b

Unfortunately, people in marketing like to use terms to describe things as they are not, and when you mix technical terms with marketing, you get resolutions that aren't 720p claiming to be 720p. The "p" is redundant when you consider that there isn't a 720i out there, but it's placed there for consistency with all the other modes, and interlacing does exist in those other modes.
It becomes even more confusing when TV producers claim their resolutions are 720p or 1080p. It seems most 1080p TVs are exactly 1920x1080, whereas 720p televisions have a wider range of resolutions. The latter are TVs capable of displaying 720p content, rather than TVs that specifically have a 720p resolution.
As for interlacing, 1080i can be displayed in very acceptable quality on a TV with half the scanlines of a 1080p TV. It also takes less bandwidth, making it easier for cable providers to transmit to the home. Considering how nice 1080i looks, I'm willing to take the interlacing for the improved picture quality it offers over 720p. I'm sure there will be a point in the future where 1080i will be deprecated in favor of 1080p, because bandwidth won't be an issue, but until then, it's going to stick around for a while.
If you're interested in this, see if you can find a copy of the ATSC standards document. That's where it says that 720p, 1080i, and 1080p are very specific resolutions.
The thing is, if those terms describe things other than video with 720 or 1080 lines, they're misleading; and if they only describe video with exactly 720 or 1080 lines, they're not particularly useful, and other video that's 1280 or 1920 pixels wide doesn't really have a name.
I'm against interlacing because it's just... blah. I'm a quality freak. The computer industry long ago fully switched over to progressive, game consoles and optical disc players are progressive-scan, and as people continue to switch over to LCD, DLP, and plasma TVs, the presence of interlacing just means you get half the image for half the bandwidth, with the added expense and quality drop of having to deinterlace.
http://www.hometheatermag.com/gearworks/1106gear/
http://blog.hometheatermag.com/geof.....061080iv1080p/
"I will not waste time proposing name changes for things unless I'm actually in the society that determines the names."
That's just bad on so many levels :P
It's only obvious to someone like you, who
A- Has decided interlacing no longer has any purpose
B- Thinks that measuring width is somehow more 'obvious' than height
C- Decides that anything older than 480p is 'deprecated' (which it is /not/)
D- Doesn't know where the old video refresh rates came from
E- Doesn't feel that stereo is sufficient
Since I only fit on C and D, I think it's fair to say that 'obvious' to you may look 'retarded' to me, and as neither of us is on the committee making the names, I don't think arguing about it is particularly productive. :P
Of course, if you wanted to, you could write up the 'Twile Standard' and release it.
B) Width is not any more of an obvious metric than height. However, between all the "720p-like" videos I have, which are basically all things wider than SD widescreen but smaller than or equal to 1280x720 pixels, the only constant is the width of 1280 pixels. As I thought I made clear, due to the variety of aspect ratios used in the film and TV industries, very few things marked or thought of as having 1280x720 resolutions actually have exactly that. 1280 thus tends to be a constant measurement for these resolutions, whereas 720 is a rarely-achieved upper bound. Consider the dilemma that occurs when you want to discuss a 1920x1200 (24 inch) computer display, or a 1280x800 laptop display. Calling them 720p or 1080p isn't entirely accurate because they have more lines, so it's somewhat selling them short. But calling them 800p or 1200p is also silly because then we'd have tons of measurements we'd have to add... 768p, 1024p, and 1050p to name a few. Or you can always refer to them by their actual resolutions, but that's tedious when you just want to make a broad generalization about the display capabilities of a class of devices. WHAT'S A PERSON TO DO? I already explained that. Use width as the measurement, because it tends to be more constant. Frankly, if I had things my way, we'd have one fucking aspect ratio for all video content and devices, to eliminate the foolish practices of letterboxing and pillarboxing. Then, you're right, either measurement would be just as 'obvious', and only one would be needed to determine the full resolution (whereas today, 720p content can be played at native 1:1 pixel ratio on 1280x720, 1280x800, and 1280x1024 resolutions, and 1920x1080 or 1920x1200 can correspond similarly to 1080i/p). Consider the following explanations to Joe Everyman: "It's called 1280w because it's 1280 dots wide" versus "It's called 720p because it can be up to but sometimes less than 720 dots tall, and just ignore the p because it doesn't mean anything in this case".
C) Video below 640x480 should be eliminated as much as possible. Games, movies, and visually-oriented TV shows benefit heavily from higher resolutions, this I shouldn't even have to argue. The only case where sub-SD resolutions can plausibly be argued to suffice is when you're trying to convey an idea or information, such as with the news or those "wow, I can't believe they did that" YouTube videos, and sometimes fan-made music videos that just need to bring the scenes into your mind rather than show you every last detail.
D) What do I not know about video refresh rates?
E) Stereo can be sufficient for the audio equivalent of the cases I described in C. If the purpose of the audio is to convey information, such as the news, then it's okay. If the point is entertainment, i.e. any case where you'd consider using >192 kbps audio, I can't see why stereo should be considered sufficient. The fact is that, more than ever, people are able to get affordable surround sound for computers, home theater systems, and videogames, and if they have to do a software-based extrapolation of what came from where to try and create new channels, results will be mixed, and not always very good. Except in cases where bandwidth is precious, I don't see why file sizes can't be a little bit larger to carry a few more audio channels and satisfy everyone.
I'm sorry if my thoughts look retarded to you. You do always have the option to not, you know, read my journals. If you think it's a waste of my time to ponder over things I can't change, it's surprising that you bother to read and comment on them.
2. One aspect ratio would be nice, but it isn't happening.
3. 640x480 still exists; we can't remake old TV shows into 1080p.
4. Something I don't know either, namely where the crazy numbers come from.
5. Personally, stereo is sufficient; I have never heard any difference when using 5.1 or 7.1...
As for looking retarded, my whole point is that what is 'obvious' to you is not 'obvious' to others. The system you proposed didn't look 'retarded' but it did look a bit silly :P
As for not reading your rants, I personally /enjoy/ arguing. :)
2. I know we aren't going to get one aspect ratio. Hence why I said, if I had things my way.
3. I know that 640x480 video is still out there. What about it? Its use should be discouraged when possible, opting to record things at higher quality levels and broadcast when it's feasible. That's all I'm saying about that.
4. I don't know what crazy numbers you're referring to with respect to refresh rates.
5. You unfortunately must not have been using a very good setup or media. When listening to HD movies with 5.1 sound, I get to enjoy such effects as hearing explosions and gunfire off-screen, and also hearing things moving with a greater sense of depth. Things moving towards the viewer can continue to make noise (speech, engine sounds, rockets) when they go off of the screen, the sound slowly shifting from front-only to rear-only. As far as 2-channel audio goes, having a 5.1 setup allows me to be "bathed" in audio. The difference when I flick the audio input type switch from 5.1 to 2-channel is quite noticeable.
"Also, let's get rid of this 24 and 23.97 and 25 fps nonsense. In Twile's world of New Media, all video refresh rates are multiples of 30, with 30 being the absolute minimum baseline, 60 being the ideal standard, and 90 being the peripheral-vision-tickling high-end which aims to push your visual acuity to the max. Sidenote: in Twile's world of New Media, everything must support at least 5.1 audio, even if some of the data is duplicated, just to ensure identical playback with the proper sound balance across all devices (my computer tries to strip the speech out of music and movies to only put it in the front audio channels while retaining other sound effects and music in the back channels--it's largely successful, but with some songs the resulting sound from the rear speakers is extremely processed-sounding and tinny)."
Now, this might all be possible just because not everyone is doing it, but the more people who buy and make full use of broadband, the faster the tech companies will be motivated to switch on fiber optics and give us absolutely ludicrous transfer rates.
I'll not even try to predict the state of things "many decades away". It was only about 5 decades ago that the room-sized, dozen-operation-per-second electronic computer was first built, 2-3 decades ago that it became possible to own one, and far less than a decade ago that it became practical to own a laptop. Calculations have gone from dozens per second in the equivalent of supercomputers to half a trillion per second in a graphics card, memory has gone from being made by the byte to being made by the gigabyte, storage has gone from 5 MB in an expensive appliance to 160 GB in a pocket-sized consumer-grade music player, and most relevantly, internet transfer speeds have gone in 20 years from several KB/sec at best over phone lines to hundreds or thousands of KB/sec in each direction, simultaneously, with the potential for orders of magnitude improvement if internet speeds catch up to LAN speeds, and many more if they toss in optical fiber.
Unless somebody STOPS the 'net being a viable system for TV broadcast, I'm going to say it's already here and only becoming more prevalent as time goes on.
As for speeds...
The US has 1.97 Mb/s
Japan has 61 Mb/s
ROK has 45 Mb/s
We are pitifully behind. You need ~ 6 Mb/s for most streaming systems now in development.
Part of the problem?
\"The Federal Communications Commission, which has broad sway over the emerging broadband market, defines \"high speed\" as 200 kilobits per second. The benchmark was adopted more than a dozen years ago when still-slower dial-up was the rule. Cohen says 200 kilobits is not even recognized as broadband in most countries today. \"There is nothing speedy about it.\"\"Main Source - http://www.usatoday.com/tech/news/t.....t-speeds_N.htm
First of all, it's possible to make products that have BitTorrent integrated and hidden away from the user. I got some large file from Blizzard one time (think it might've been a demo or open beta or something) and it clearly utilized BitTorrent to establish the connections and save their bandwidth. Their FAQ for the downloader app was almost identical to the one BitTorrent had at the time, addressing the same issues in the same ways, leading me to believe they incorporated it into their software. Is it too hard to believe a major corporation could develop TV channel software which sends proprietary, DRM-protected files via an integrated BitTorrent component?
Second, the problems with assuming you need ~6 Mbps for streaming video are as follows:
1. Commonplace TV show downloads at reasonable quality are 175 MB per ~22 minute file, which is barely over 1 Mbps to stream that content all day long. Hell, 6 Mbps is good enough for 720p content (which is typically 5 or 10 Mbps in downloaded trailers or ripped movies).
2. You're assuming stuff has to be streaming, which I would hardly consider to be an advantage over standard TV content. Ideally you would subscribe to the shows you like and have those delivered to your computer as soon as they're available, then you watch them at your leisure (and possibly the software auto-deletes them after viewing). Assume that you like to watch 2 hours of TV a day, with commercials included. That works out, with standard 175 MB / 22 minute episodes, to 956 MB every 24 hours. That's like 90 Kbps, which is way below the threshold for the antiquated definition of "high speed". And you can cut that in half without much trouble if you use more advanced codecs such as H.264 for your content encoding.
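A quick sanity check of the arithmetic in points 1 and 2, treating MB and Mbps loosely in round decimal units, the same way the numbers above do:

```python
# Bitrate of a 175 MB, ~22-minute episode, and the average rate needed
# to deliver 2 hours of such episodes per day. Rough decimal math only.
episode_mb = 175
episode_seconds = 22 * 60

print(f"streamed in real time: {episode_mb * 8 / episode_seconds:.2f} Mbps")

daily_mb = (2 * 60 / 22) * episode_mb            # 2 hours of viewing per day
avg_kbps = daily_mb * 8 * 1000 / (24 * 3600)     # spread over the whole day
print(f"{daily_mb:.0f} MB/day, averaging {avg_kbps:.0f} Kbps around the clock")
```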
There are really two ways to make TV via the 'net a viable and preferable content delivery system. You can make it so files can be shared however people want, but are proprietary so they can only be played back in an app which doesn't allow you to fast-forward and which includes commercials. Or, you make it so you have to pay to download the files and can't share them with other people (DRM protection) and make them commercial-free. In either case a system can be imagined where you download first and view whenever you want, or view the content streaming. Unless you insist on overly inflated DVD video being streamed, it's totally feasible with today's bitrates. And maybe it's not available to people with low-end "high speed" internet or dialup... so what? Nobody says you have to discontinue TV entirely. But if they want to be able to watch TV shows on demand and/or commercial-free, they'll just have one more reason to get a good 'net connection.
The video and music industries /hate/ BitTorrent; asking them to use it, or something like it, is viewed as an instant 'no-go'.
That having been said, I don't think it's safe to say that things are or will stay this way. Any hatred the entertainment industry has for BitTorrent is based on ignorance of what it really means. BitTorrent does not have to go hand-in-hand with people who distribute recorded TV shows, just as VHS didn't have to go hand-in-hand with recording and copying TV shows and movies. With VHS, movie and TV studios found a new way to sell their content for personal playback at any time--a legacy which has been passed on to the superior DVD. There's no reason to think this can't happen to BitTorrent... although initially it seems like a frightening platform which is associated in some way with stealing content, companies will eventually realize that they can make a profit off of this, and profit motivates all.
People such as myself are willing to pay to have content commercial-free, on-demand, and portable between computers, media players and home theater systems. Rather than waiting until 6 months after the end of a season to get a boost from DVD sales, they can have their future costs being covered literally the day the content airs... they can put out as many shows as they want at a time, having effectively unlimited "prime time" slots... they can get direct feedback, as content is released, about exactly how many people are watching a show and how highly they rate it, etc etc etc. There are so many advantages for them and for us that to think they won't pick up on it is absolutely silly. Hell, they're already doing it with things like iTunes--they, like the rest of the media industry, just need to learn the following lesson and then all will be good:
Threat of piracy aside, you need to open up the content to be DRM-free and playable on any device by anyone. Some people who are broke (and thus aren't really depriving you of revenue) will pirate and view the stuff, but that can't be helped. It'll win over many people who presently pirate stuff because they don't like to watch stuff live on TV, or pay for TiVo, or record it themselves, etc etc... these people will be won over by the offer of affordable, high-quality content that they can enjoy anywhere, any way they want, knowing they won't get sued. The movement towards pure MP3s being sold is evidence of this working.
The internet yo... it's gonna be the best way to watch TV...
I /want/ what you want, I just realize that it's not going to happen anytime soon...
Much like 3D CAT scans combined with Fourier transforms allowing a national database of high-resolution images of everyone's body...
It would be nice, and we have the tech... but it's still not going to happen for 10 more years... (we had the tech for it 10 years ago...)
That, combined with TV shows on iTunes and Xbox Live, shows that at least part of the entertainment industry is willing to take TV content and distribute it free and streaming with commercials, or paid to download without commercials, over the 'net. It's not a total switch, but this stuff takes time. All we can really ask for is for them to change more content over to the new model and pull out the DRM, but again, takes time.
Film is projected at 24 frames per second but using a shutter that makes it flicker at 48 fps or 72 fps -- because most people can't see that high a flicker rate in a darkened environment like a theater. So video made from film has to insert an additional field (not frame!) every so often to increase it to 29.97 frames/sec for display on a traditional analog TV. And when you play it back on your computer, your graphics hardware also has to muck with the timing and insert frames to increase it to whatever scanrate you have your graphics card generating.
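The "insert an additional field every so often" trick is, as far as I know, 3:2 pulldown; here's a simplified sketch of it (real pulldown alternates top and bottom fields, which this ignores):

```python
# Simplified 3:2 pulldown: 4 film frames -> 10 fields -> 5 video frames,
# turning 24 fps into 30 fps (the whole signal is then slowed by 1000/1001
# to land on 23.976/29.97). Field order/parity is ignored in this sketch.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))   # 2 fields, then 3, repeating
    # pair consecutive fields into interlaced video frames
    return [tuple(fields[i:i + 2]) for i in range(0, len(fields) - 1, 2)]

video = pulldown_32(["A", "B", "C", "D"])
print(video)                 # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
print(24 * len(video) / 4)   # 30.0 video frames per second from 24 film frames
```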
Film-based DVD content actually is only 24 frames/sec. Standalone players include the hardware to do the frame-rate conversion to drive the TV--which is different for NTSC, PAL, or SECAM. Video-based DVD content is at whatever frame rate was appropriate for the country where it was recorded.
Most movies are still made on 35mm film -- which is then scanned at either 2K or 4K resolution: equivalent to or higher than HD. Most CGI special effects are done at 4K. And then it's put back onto 35mm film for distribution to traditional theaters. Most digital theaters are still at only 2K. A very few have 4K. (2K and 4K refer to the horizontal resolution, just as the furry red dragon has suggested. Vertical resolution varies depending on how wide-screen it's supposed to be.)
Much of this is described on Wikipedia, of course.
And seeing as the class was part of the DIGITAL Media program, and we used a digital format for all of our work, he does know the history of both.
Wikipedia states that it is because of the airwaves and the ephemeral ether through which TV was broadcast:
"Color information was added to the black-and-white image by adding a color subcarrier of 4.5 x 455/572 MHz (approximately 3.58 MHz) to the video signal. In order to minimise interference between the chrominance signal and FM sound carrier, the addition of the color subcarrier also required a slight reduction of the frame rate from 30 frames per second to 30/1.001 (very close to 29.97) frames per second, and changing the line frequency from 15,750Hz to 15,734.26Hz."