[Tech] The HD Conspiracy
17 years ago
In my time I've come across people with all sorts of opinions on high-definition. There are people like me, who can't get enough of it. There are people crazier than me who spend thousands on high-end display equipment and movie collections and PS3s just so they can appreciate every last pixel. Then there are people who can appreciate the extra clarity of HD but just don't feel the cost is justified, and finally there are the people who can't tell the difference (or say they can't) period, and as such aren't going to get HD stuff unless it's given away to them.
But below these groups on the scale of "HD appreciation" is another odd group, whose members I've bumped into at least twice. I'll call them the HD Conspiracy Theorists.
HD Conspiracy Theorists not only can't see the difference between SD and HD, but they doubt that anyone can. They think that HD exists purely so rich people can show off their wealth on something that's functionally worthless, like getting pure gold car door handles.
Has anyone else run into people who think this? As I said before in passing, I've met at least two people who were of the strong opinion that I was trying to show off when I mentioned that I was watching a movie in 1080p (I'd compare it to saying "I'm eating an ice cream sundae and it's got a fucking maraschino cherry on top <3"). And they went on to tell me about how they couldn't, I couldn't, nobody could tell the difference. And even if we could, who needs to see more specks of dirt and hairs and pores and things of that nature, why can't we just appreciate the damn movie?
I for one like to see the pores and dirt. I need it to appreciate the movie. Also the elimination of fingernail-sized compression artifacts that you get when you move from 700 MB SD rips to 14000 MB HD rips is a sweet bonus.
You can tell the difference; I, for one, though, don't really care either way... lol
As for the PS3, if I'm not mistaken, it can display more colors than the eye can see (oh wait, I'm right, since a standard computer monitor can already do that, and IIRC the PS3 displays powers of two more colors than that).
Anyway, where we may be differing is the difference between "see" and "distinguish." I will admit that the visible spectrum is infinitely divisible and you can assign as many bits to it as you want, but we can only TELL the DIFFERENCE between about 10 million of those colors (try it! Go into paint and scribble down some 0,0,0 and then change to 0,0,1 and scribble over it, repeat (0,0,2) until you can see the difference).
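The scribble experiment above can be automated a little. This is just a sketch of that test using only the standard library (the filename and band sizes are arbitrary choices of mine): it writes a PPM image with side-by-side vertical bands of blue values 0 through 7, so you can open it in any viewer and see at which step adjacent bands become distinguishable.

```python
# Write a PPM with bands of blue = 0, 1, 2, ... so adjacent bands
# differ by one 8-bit step, mimicking the 0,0,0 vs 0,0,1 scribble test.
WIDTH_PER_BAND = 64   # pixels per band (arbitrary)
HEIGHT = 128
BANDS = 8             # blue levels 0..7

def write_band_test(path="blue_steps.ppm"):
    width = WIDTH_PER_BAND * BANDS
    with open(path, "wb") as f:
        # Binary PPM header: magic, dimensions, max channel value.
        f.write(b"P6\n%d %d\n255\n" % (width, HEIGHT))
        row = bytearray()
        for x in range(width):
            blue = x // WIDTH_PER_BAND  # constant within each band
            row += bytes((0, 0, blue))
        f.write(bytes(row) * HEIGHT)  # every row is identical
    return width

write_band_test()
```

Whether you can actually spot the 0-vs-1 boundary will depend on the monitor's brightness and gamma, which is exactly the point made below about dim displays.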
As far as distinguish goes, I dunno, I'm going off of what my professor says. It also depends on other attributes of the monitor, though: if you have a monitor which has 4 times the brightness, the brightness difference between all steps is also blown up by a factor of 4, in which case you definitely WILL be able to tell the difference. So just because you can't see it on one given monitor doesn't mean much--you can set up an example with a very dim monitor where you probably won't even notice 16-bit color, but you can sure as shit see more than 128 levels of gray. I'll try mucking around on my main display (500 cd/m^2) after classes today.
But yes, I need to have the brightness all the way up and then look at an oblique angle, which increases the contrast.
HD does many things, but most prominently, it allows you to see more details of more objects at once. So you can see a single enemy with 5 times the detail, or you can see 5 times as many enemies with the same level of detail. In that sense, it does sorta allow us to process more... though only because the resolution is the bottleneck right now.
XP No difference there...<..< Ya right...
I have actually met the polar opposite of what you experienced. When I was in high school I bought a copy of Jasc's Paint Shop Pro. I told my computer teacher about it and he scoffed at it because I had only paid $160 instead of the $1000 he paid for a license of Adobe Photoshop, and therefore his was better just because. He never tried out Paint Shop Pro, as far as I know, but I tried out his copy of Photoshop and worked with my copy of Paint Shop Pro, and I didn't notice much difference between the two, other than that I liked working with Paint Shop Pro more. This relates to the HD conspiracy theorists because you have to wonder whether the other person really is noticing a difference, or is just deluding himself into thinking it's better because he put out a lot of money for it.
For reference, the human acuity limit is about 100 pixels per degree. So what you do is this: Measure the width of your thumb, call it X, then measure the distance from your eye to your thumb when your arm is stretched out, call it Y. Using Google or a calculator, do arctan(X/Y). Then take that number and tell Google "______ radians in degrees". In my case, my thumb is about 1.9 degrees. So I can see a max resolution of 190 pixels in the width spanned by my thumb.
Sit back where you view your TV or computer display and see how large of a visual angle it is, multiply by 100. Ideally, that's what its resolution would be--any more is wasted because you can't detect it.
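The thumb estimate above can be written out as a short calculation. This is only a sketch of the arithmetic the comment describes, assuming the ~100 pixels-per-degree acuity figure; the thumb width and arm length below are example measurements, not anyone's actual ones.

```python
import math

ACUITY_PPD = 100  # approximate human acuity limit, pixels per degree

def max_useful_pixels(width_cm, distance_cm):
    """Pixels the eye can resolve across an object of the given
    width, viewed from the given distance."""
    angle_deg = math.degrees(math.atan(width_cm / distance_cm))
    return angle_deg * ACUITY_PPD

# Example: a thumb ~2 cm wide held at ~60 cm (arm's length)
# subtends about 1.9 degrees, i.e. roughly 190 useful pixels.
print(round(max_useful_pixels(2.0, 60.0)))
```

Plug in your display's width and viewing distance instead of the thumb measurements to get the resolution beyond which extra pixels are, by this argument, wasted.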
While I was arguing with you last time about VGA cables versus HDMI, I wasn't going on about "not being able to tell the difference".
My computer monitor, which is essentially my "TV" as well, was "high-def" at the time (720p)... and sadly I'm disappointed that I don't have the cash to get a better display. (I can walk into my local 'tronics store and buy a pretty decent 1080p display for about $900.)
I have bumped into some people who can't tell, or care about, the difference in HD displays.
I think they are blind or some shit... but then again, maybe some people are just poor-sighted?
What do you think about sound?
There are even more people who can't appreciate the quality of SOUND that comes out of a movie. (I don't mean surround sound... I just mean pure sound QUALITY, and the equipment it's played out of.)
I've spent a lot of money on my computer's sound ($1500 on tall floorstanding speakers and digital decoding amps).
Similarly with my car.
Surround sound is a gimmick for people with the extra cash... since I use my sound equipment for music, which is 2-channel anyway.
http://www.newegg.com/Product/Produ.....82E16824001309
That's "1200p" and it's less than $400.
My current monitor is 32 INCHES... to replace it, I need the same size or bigger with better resolution than my current screen.
Conventional computer monitors suck.
I think...
Watching a movie or playing a game, you notice the difference (and it looks awesome), but with TV, it's so crappy that it looks worse than SD whenever there's a lot of action or movement.
If you buy a 50-inch TV... don't sit 15 feet away.
My 32-inch monitor is NO MORE than 2 feet from my face. The bad part is that this display is only 720p...
I buy size so I can FILL my vision with it... not take up living room space.
Until I have the money to invest in a SWEET projector... a dome translucent screen... and expensive software/hardware for changing the field of view and mirror-flipping the image onto a full-periphery screen...
I'll just stick to my current setup.
I LIKE stuff that gets the job done but doesn't take up much space. If in the future my eyesight gets worse and I need to upgrade in order to see what's on the screen more clearly (like my parents... and that much I can understand), I very well may upgrade to a larger display. But right now I can see just fine with my glasses, and the last time I saw my optometrist, he said there was no change in my vision and expected it not to change much over the remainder of my life...
Sometimes bigger isn't better; sometimes it's just bigger... and more of a liability...
I find it ironic that you consider a projector to be cumbersome but not a 50" TV >_>
I'm blind past an arms length, so I really don't get any benefit unless I'm right on the screen.
...or will they? Is the brain still able to register color ranges to which it isn't exposed? I feel like I should know this, given that I'm having a visual perception prelim on Thursday. Hmm...
The A/V manufacturers and studios are all trying to persuade you to give them your money, buying all-new hardware and the HD versions of all the movies that you have in SD. The fact that their HD products can look and sound better than SD is just the bait on the hook. (Unfortunately, just because they can look better does not mean that they do. Just as there are plenty of SD transfers that are worse than VHS, it won't be long before there are plenty of bad HD transfers, too. Upscaling lousy source audio and video doesn't improve it much.)
Whether those companies will get what they want is another matter entirely, of course.
Ahah, but I'm a step ahead of them. I don't pay for my content, and even if I did, I'd never pay for an SD and HD version of the same thing--I'd pirate one or the other. I refuse to pay any sort of premium for it, beyond what's technically required (additional storage space on my computer, for example). Which brings me to another point: I only even have HD-compatible hardware because I do something which absolutely benefits from it, general computer usage and gaming. It wouldn't be worth hundreds or thousands of dollars to me if all I wanted was to get 1080p versions of movies.
I'm actually more optimistic than you. Well-done digital effects look better in HD, hands down, so anything that's animated (Ratatouille is a great example) will look phenomenal--because obviously, 3D graphics don't have a native resolution. And as movies are increasingly shot digitally from end-to-end, we'll just see the quality get better for new releases. Of course, that still leaves a huge number of films from the last few decades. I do have 1080p copies of things like Predator and Robocop, and the HD rips for those aren't -bad-... just that you can see all the film imperfections in HD too. Speaking of which, the film: I was told in my perception class that some company did a study back in the 1970s or 1980s about the effective resolution limit of the analog film that most theaters still use today, and it was found that they could be appropriately captured with an image ~4000 pixels wide. That corresponds pretty well with people talking about 4K raws and high-end film-grade HD cameras aiming at about that resolution, and it also indicates that today's 1920-pixel-wide format still hasn't hit the quality which can be captured on film. If those guys were right, then technically speaking, we should still be in the good range where there's actually more data to be captured and we're not just inventing values.
I will eventually get an HDTV, but I'll do it when I feel I can have it without concern, at a sensible size that I can appreciate without it taking up too much space. Just as I was getting off work, a person in camo called out to me and said, and I quote every single word: "I need a Sony 37" HD LCD TV. Do you have any?"
One major thing I will say though, is that I think the maximum a video resolution should be is 800x600, IF it's not compressed. THAT's the key difference. Lossless files are WAY better than having compression artifacts of any sort in your file, and I can see how people would think "OH! It's because I'm looking at it in a higher resolution that I'm not seeing any squares!", but they're wrong. It's all with the quality of the file. It's like having a competition between who can make a louder sound and who can make a clearer sound. It's two totally different aspects of quality that 90% of EVERYONE doesn't get, and so they just justify it in the simplest way possible. The reason HD has taken off so much is because it is the first format and resolution change where the providers have started using the BEST compression methods or even NO compression methods. THAT's why HD looks so good, not because of the resolution, but because the video file you got was FINALLY of a good quality.
I can tell the difference between 720p and 1080p easily when it's on a native 1080p display from a decent viewing distance, myself. Especially during slower moments where you have close-ups of people and such. Hell, even Shoot 'Em Up, which was tons of action, looks glorious in 1080p.
Tch.
How soon they forget ;3
There are many animated movies and shorts which were created manually by painters before there were such things as computers. Many have been scanned only in SD or have not yet been scanned at all. They'll have to be re-scanned into a higher-resolution digital format. I suspect the same is true for many older movies (available only on celluloid) which were scanned before HD was envisioned. Sadly, celluloid disintegrates with time, so some have been lost forever and others can't be scanned again. :( (Fortunately, more recently made movies were filmed on mylar-based stock, which doesn't have that particular problem. They primarily suffer from color shifts. Or being lost.)
Although the underlying 3D graphics of modern animated movies don't have a native resolution, it is not at all obvious that most of that 3D will be recoverable in the future -- only the resulting renderings to a specific resolution (hopefully 4K) will be available. The obsolete proprietary software needed to translate and render the older proprietary 3D formats simply won't exist on future platforms. Although many studios use either 3DS Max or Maya, the major studios (like ILM) usually modify them heavily or use their own software packages to create more effective results. I doubt that anyone would even consider re-rendering one of the Final Fantasy movies, for example.
My understanding with regard to the equivalent 4K quality of 35mm film stock is the same as yours. Supposedly most digital effects are created at 4K resolution, too. Unfortunately, many low-budget movies were filmed on 16mm film, and some cinematographers intentionally film at low light levels or have the film processed to accentuate the grain. Even so, this means that there'll be at least one more generation of SHD (super-high-definition) 4K digital equipment. My understanding is that quite a few live-action movies have been recorded directly to 1080p, though, which may be a problem for them in a future SHD format. My personal guess is that it's likely to be about 5-10 years before any SHD format starts being pushed. Maybe sooner if BD really doesn't generate the revenues they want.
I'm certainly not disagreeing with the premise that good HD can be great when it's been transferred through a fully HD chain of processing and viewing equipment. Whether or not its expense is justifiable is still an individual choice.
Bah, if those studios are clever, they'd have older builds of their software available. Of course, it's quite possible that some of their software is dependent on a particular hardware platform, in which case they can either try to reconstruct the platform, adapt the software, or try and use a combination of the film scan with the DVD video to come up with higher quality video (coming up with a semi/fully automated algorithm that can use lower-quality sampling to remove noise from higher-quality sampling, such as using an image thumbnail and a full-rez scan to eliminate paper grain while still keeping most of the crispness of the image, is something which has intrigued me, do you know if anyone has done work with this?). If they CAN adapt the software though, joyous day! Hardware has improved in speed by a factor of around 25x between 2001 (Final Fantasy: The Spirits Within) and now, if Moore's "law" is accurate over that time, which it generally seems to have been. So every month they spent rendering can now boil down to about a day, a year into a week. At worst it might take a month to re-render, although if they're doing it for a particular resolution which is lower than the 4K they did for film, it might take several times less again (1080p is roughly half of 4K I think).
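The ~25x figure works out if you take the textbook doubling period of about 18 months. Here's the back-of-the-envelope version; the years are from the comment above, "now" is assumed to be 2008 (roughly when this thread dates from), and the 1.5-year doubling period is the usual rule of thumb, not anything measured.

```python
def moores_law_speedup(years, doubling_period_years=1.5):
    """Hardware speedup after `years`, doubling every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Final Fantasy: The Spirits Within (2001) to "now" (2008):
speedup = moores_law_speedup(2008 - 2001)
print(round(speedup))  # about 25x

# So a month of 2001-era rendering shrinks to roughly:
print(round(30 / speedup, 1))  # a bit over a day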
I watched 28 Days Later not long ago, and that was in standard def for most of the movie--oddly enough it switches to high-def after the guy wakes up following a car crash towards the very end. Dunno why they did that really, but they could've been trying to degrade the quality of the film intentionally.
In any case, unless we switch entirely over to vector/parameter-based films (so everything becomes CG) or we pick a resolution which we think will be sufficient for the human perceptual experience from now until evar *snrk*, we'll always have this issue of catching up with previously high-quality stuff and having it become standard, then sub-par. Although 3D isn't a huge thing right now, if it does get big, we'll have the same problems--conventional films aren't usually in stereo. Same thing if virtual reality-style films get big, in which people can move around to some degree (motion parallax is a dead powerful depth cue), even the 3D film will be just beyond adapting. And Star Trek-style interactive film... well that's just beyond the realm of imagination in how one could adapt even a fully HD film, in any way.
But yeah. Things which revert to analog and traditional mediums can be tricky. Fortunately for the HD experience and less for the film history experience, I watch fewer of those with each passing year. It's kinda a self-resolving problem in that way; although people might occasionally go back to watch some older films (in which their love for the films will make them overlook any quality issues), the bulk of what they watch will be recent enough that it will either be shot in native/better resolution than their final viewing equipment supports, or on a medium (mylar film) which is still good for scanning.
Not only was the whole HD thing misleading, but unless you had some crazy expensive satellite package that included hundreds of channels you never watched or had the more expensive DVD HD player with some special edition DVD which wasn't the same as the rest of the mass produced DVDs of that movie, it was all worthless. Of course now a days this isn't the case.
Since most new TVs being sold these days have all the "converter boxes" and extra crap they never told you about already built into one package I can somewhat see it being justified, but you still need those extra HD channels from cable or satellite, which of course cost extra. Not every channel in these more expensive packages are broadcast in HD ether.
I guess I'm still rather jaded by the fact that the introduction of the whole HD thing was entirely set up to suck as much money from people as possible. I hate being lied too and being mislead is just another form of lying.
I just went straight for it on my PC and was knowledgeable enough that I didn't get tricked =P
One accessory to make it work, fine, whatever. 3,4 or even 5? What the fuck?
Though really, it should be expected that you'll need better video sources. You can't just generate quality out of nowhere!
Granted, I also can't see a reason why I would need game consoles, subscriptions to internet gaming services, or a computer capable of running Windows applications, either--there's nothing they'd give me that I'd really care about, let alone pay for. That's the kind of cheap and boring person I am :3
I believe that most people who can't see huge differences are looking at poor displays from long distances with poor video feeds. There's a massive difference.
My DVD purchases for the entire year amount to ten dollars plus shipping (I found a recording of a Westinghouse promotional video from 1943 and felt it necessary to show it to people), and this house doesn't have cable or satellite TV service--and I don't care, because there's nothing I want to watch, let alone pay for.
This computer doesn't even have a DVD drive--and I haven't bought one, even though I could get one quite easily via eBay, because I can't honestly think of a single DVD I need to put in there. As for the computer's display? Compared to what I had when I was younger, 1280X1024 is awesome, especially when I can and regularly do get such displays for less than ten dollars each at property auctions.
That in mind, how can I justify spending what is, at least for what I get paid, a shitload of money on expensive display technology, when I can't even think of a single thing I'd need it for? I'd be better off to buy a cubic foot of zinc. At least, then, I'd have a neat doorstop.
I'm not saying high-definition media aren't not better technology. They are, by any standard on which I might care to define better technology. Just that for me, I neither want it, nor can I afford it.
A DVD drive is useful for things like games and storing data for cheap, but if you don't play many modern games or have a lot of data to store and back up, that wouldn't be too useful to you I suppose.
All I was saying is that the differences aren't minor ;p I watched a Heroes episode in widescreen standard def on my 24" monitor from maybe 5 feet away, with my glasses off, and it was notably poor quality--and after about 4 feet, my vision starts to get blurry without my glasses. That's a pretty significant difference =p But if it's not worth it to you, it's not worth it to you.