[FYI] HDR, 30+ bit color, why it matters
17 years ago
So in my last journal, I got in a small side-discussion with someone over how many colors and things the human eye can perceive, and how it relates to displays and movies and games and the like. Given that "HDR" is a current buzzword, and a lot of people say "I can't tell the difference between white and the faintest gray on my monitor, we have enough colors," I feel that I should clear things up from a perceptual perspective. There are a number of factors in play here that you need to be aware of.
First, let's consider the dynamic range of the human eye. The brightest thing you can look at without damaging your eyes is on the order of 10^7 candelas per meter squared (10 times as bright as the filament of a 100-watt lightbulb), and the dimmest thing you can see is 10^-6 candelas per meter squared (as bright as a black piece of paper illuminated by only starlight, no moon). That means the brightest thing you can look at and the darkest thing you can see differ by a factor of 10^13, or 10 trillion.
Of course, that's not to say that your eye or brain can resolve the difference between 1000000 photons and 1000001 photons. The amount of contrast you can resolve varies based on your age, personal factors, the level of illumination, and the distance over which it varies. For a 20-year-old, a 200:1 or 300:1 contrast ratio seems to be pretty good. My perception professor says that 1000:1 is about the limit, which disagrees with some of the graphs I'm seeing, but who knows. Some people can see much better or worse than this. A piece of white paper with text on it has a ~30:1 contrast ratio, which is about as good as you can get with even illumination over a scene.
Let's compare this to a monitor--we'll use mine and call it typical. It advertises 500 cd/m^2 in brightness and a contrast ratio of 1000:1. Assuming that full brightness is 500 cd/m^2, that would put minimum brightness at 0.5 cd/m^2. But if you'll recall, the human eye can see values from 0.000001 to 10,000,000 cd/m^2, meaning this display could be improved at both ends: brighter brights and darker darks. Even in a setting which won't require your visual system to spend 30 minutes dark-adapting, things range from 1 cd/m^2 to 10 million cd/m^2, so there's room for improvement.
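To put numbers on that gap, here's the arithmetic in a couple of lines of Python (the luminance figures are just the ones quoted above):

# Quick arithmetic behind the comparison above, using the numbers from the post:
eye_max, eye_min = 1e7, 1e-6            # cd/m^2: brightest safe thing vs. dimmest visible thing
monitor_max, monitor_min = 500.0, 0.5   # cd/m^2: from the 500 cd/m^2, 1000:1 spec

print("eye range:     %.0e : 1" % (eye_max / eye_min))          # 1e+13, the "10 trillion"
print("monitor range: %.0f : 1" % (monitor_max / monitor_min))  # 1000 : 1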
But let's get back to contrast. If you can tell apart two things whose brightness differs by only 1/1000th, and a monitor only has 256 levels of lightness, it's clear that it has only about 1/4th of the number of steps it needs to span that range. So storing colors with only 256 brightness values isn't enough. Let's pretend it is, though--let's say a person really CAN only see a difference of 1/256th. Why would we want to store any more values than that?
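Before getting to that, here's a quick back-of-the-envelope check of the 256-versus-1000 counting in Python (the 1/1000 threshold is the assumed figure from above; the last number just shows the gap gets even wider if the steps have to be equal ratios rather than equal amounts):

# Back-of-the-envelope version of the counting above (the 1/1000 threshold is
# the assumption from the post, not a measured value):
import math

codes_8bit = 2 ** 8                # 256 levels per channel
levels_equal_steps = 1000          # one level per 1/1000th of the range
levels_equal_ratios = math.ceil(math.log(1000) / math.log(1 + 1 / 1000))  # equal *ratios* instead

print(codes_8bit, levels_equal_steps, levels_equal_ratios)   # 256, 1000, ~6900
print("shortfall:", levels_equal_steps / codes_8bit)         # ~3.9x -- the "about 1/4th" above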
Several reasons. First, consider taking a picture inside a church: the stained glass windows will be letting a lot of bright sunlight through, but the corners up at the top of the roof, and any architectural details up there, will be in a great deal of shadow. If we only have 256 values to go between a really bright color and a very dark black, we have basically three choices: we focus on the dark details so we can see them and anything above a certain threshold just becomes white (a long exposure), we focus on the light details so we can see them and anything below a certain threshold just becomes black (a short exposure), or we try to map the darkest dark to black and the lightest light to white and we end up with very visible borders between brightness levels. Were we actually there, we could let our eyes adapt to the light or the dark regions, revealing either the fine details in the dark corners, or the fine details in the stained glass windows.
And in fact, for screens with a very high dynamic range and higher bit-depth images, that's exactly what can happen: you can look at a part of an image with a large contrast range, and your eyes will literally adjust and adapt to the brightness range you're looking at to bring it into clearer focus.
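Here's a toy sketch of those three squeeze-it-into-256-values choices from the church example (the luminance numbers are invented purely for illustration, not measurements):

# A rough sketch (toy numbers, not real measurements) of the three choices
# described above for fitting a high-contrast scene into 256 output values.
import numpy as np

def to_8bit(luminance, exposure):
    """Scale scene luminance by an exposure factor, then clip to 0..255."""
    return np.clip(luminance * exposure * 255.0, 0, 255).astype(np.uint8)

# made-up luminances: dark corner of the ceiling ... sunlit stained glass
scene = np.array([0.02, 0.2, 5.0, 800.0, 1200.0])

long_exposure  = to_8bit(scene, exposure=1.0)      # shadows readable, windows blow out to 255
short_exposure = to_8bit(scene, exposure=0.001)    # windows readable, shadows crush to 0
squeezed       = to_8bit(scene / scene.max(), 1.0) # everything "fits", but the dark end
                                                   # collapses into one or two coarse steps
print(long_exposure, short_exposure, squeezed)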
What's another reason you might want something--say, a photograph, or a texture for a game--to have more than just 256 levels of brightness? Well, usually with these things, you can do some degree of editing to them. Say we took that picture of the church from before. If we used a large bit depth, we could crop to just the dark corner of the ceiling and turn up the brightness (like our eyes adapting) to see fine details. If we only have 256 levels of brightness, all the dark values will probably be within a couple of levels of each other, hiding fine detail. The same idea holds for a texture in a game, where lighting effects can be applied.
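A little Python sketch of that editing problem; the gradient values are made up, but the effect is the point--brightening an 8-bit shadow leaves flat bands, while a 16-bit one keeps its detail:

# Store the same dim gradient at 8-bit and 16-bit precision, brighten both,
# and count how many distinct tones survive. Values are invented for illustration.
import numpy as np

dark_corner = np.linspace(0.001, 0.004, 1000)     # subtle shading deep in shadow (0..1 scale)

as_8bit  = np.round(dark_corner * 255) / 255      # what an 8-bit file can keep
as_16bit = np.round(dark_corner * 65535) / 65535  # what a 16-bit file can keep

brighter_8  = np.clip(as_8bit * 200, 0, 1)        # "turn up the brightness" in an editor
brighter_16 = np.clip(as_16bit * 200, 0, 1)

print("tones left, 8-bit source: ", len(np.unique(brighter_8)))   # 2 flat bands
print("tones left, 16-bit source:", len(np.unique(brighter_16)))  # ~200 smooth steps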
So what you'll sometimes see in games these days is advertising or options for HDR, and that typically means that they're doing tricks to make the lighting look more like what you would see in real life. Obviously if you're only using 24-bit color in a game and your monitor only supports 24-bit color, you can't truly get brighter whites and darker blacks, but keeping track of an extended range of colors with better precision lets lighting effects and calculations be carried out in a more realistic manner, without just clipping values when they're too high or too low. The game can, of course, choose to do certain cool things with this, such as light bloom (anything brighter than white floods out a little into neighboring regions), or simulating the change of pupil size, which Crysis and Half-Life 2 do (so when you go into a darker area, the details come into focus, and light areas become washed out--and when you go into a bright area, things are initially bright and washed out, before settling back down to normal levels).
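Here's a minimal sketch of the idea in Python; it uses a generic Reinhard-style tone-mapping curve as a stand-in, not the actual math from Crysis or Half-Life 2:

# Lighting is kept in floating point, then mapped down to 0..255 at the end
# instead of being clamped at every step along the way.
import numpy as np

def tonemap(hdr, exposure=1.0):
    """Compress an unbounded brightness range smoothly into 0..1."""
    x = hdr * exposure
    return x / (1.0 + x)

def bloom_source(hdr, threshold=1.0):
    """Anything 'brighter than white' can be blurred and added back over
    neighboring pixels to get the light-bloom effect."""
    return np.maximum(hdr - threshold, 0.0)

scene = np.array([0.05, 0.4, 1.0, 6.0, 40.0])             # made-up linear radiance values
print(np.round(tonemap(scene) * 255).astype(np.uint8))    # [ 12  73 128 219 249]
print(bloom_source(scene))                                # [ 0.  0.  0.  5. 39.]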
As display technology gets better and better, with contrast ratios in the ten-thousands-to-one range and with ever-brighter backlighting, these issues will become more apparent. Indeed, with a good bright display, if you try to use the full range of brightness values with a standard 24-bit color photograph, you CAN notice the steps between colors, and it can actually look better to tell the display to dim itself down, oddly enough.
If any of this stuff interested you at all, I suggest you take a look at this Wiki article on a new JPEG format, which supports better color depths--I believe it even links to discussions which say some of this stuff better than I can.
I just feel that I should keep you guys in the know, and I'm learning this stuff, so why not share it? =D
Good read all the same, learned a few things too... Where did you get your facts from? :D
I got the facts from my textbook, my Visual Perception class lecture notes, and my personal experience.
NOW YOU KNOW!
Hopefully you will continue to post this kind of stuff...
While I knew this about HDR and color depth in a pretty general way, this cleared up a lot for me.
As for the jpg format, there's already an image format out there that contains... 4 f-stops' worth of data. I've forgotten what it's called though. *wikibrowsing* IMA maybe? It's supposed to store color information four times over in different contrasts (f-stops or aperture sizes). Widely used for rendering output that can then be edited later to change the contrast, IIRC. Not that we couldn't use another one.
Don't act like good graphics and good gameplay are mutually exclusive. The only two games I own which do the brightness-adjusting HDR thing are Half-Life 2: Episode Two and Crysis, both of which do quite well in the gameplay department and look pretty slick on top of that.
I dunno the pros and cons of your IMA thing. But JPEG HD also benefits from improved compression (2-2.5x) at a given quality, or improved quality at a given size, and has support for a bunch of nifty things. Anyway, bed time, so now I sleep.
As for Crysis: people keep going "OMG realistic *spooge*" and I keep going "The water looks faked, the plants look less than ideal, the..." as I feel unnerved by its "realness." (If you know anything about Shadowrun, you'd know that people feel quite unnerved by entering an Ultraviolet-level system node because "it's indistinguishable from real life.")
The question of exactly where and when the uncanny valley stuff kicks in is one I haven't seen a good answer for. I mean, I just watched Beowulf, and when I was focusing on the story and fights and dialog rather than scrutinizing every visual detail, I basically forgot I was watching a CG movie. Are they before or after the uncanny valley, or perhaps they're in it and I'm just defective? I dunno. I'm a proponent of being ABLE to do stuff very realistically--you can always use the extra processing power to make something less realistic but stylized (Ratatouille).
But hey, that's just my opinion on the matter.
I do understand your point though. Yes, it "just happens," but there are some ideas that you can see being great (like how many hardcore gamers saw Spore) which get made terrible by not focusing on gameplay.
The way you turn money into "gameplay" is by playing the game and getting more people to play the game over and over. A good game is one where you can play through a second time and get more out of it than the first time through. For example, Portal: it's a nicer-graphics, more-levels version of Narbacular Drop by Nuclear Monkey Software, while at the same time featuring an actual story and improved gameplay.
The game should be challenging (you should lose on occasion and it should mean something; in Spore, death is a slap on the wrist and means little--in the space phase it's actually less annoying TO DIE than it is to travel through a single wormhole, because you can skip the death cutscene but not the wormhole one).
Winning should give the player a sense of satisfaction at having achieved something (in Spore, getting to the core is a huge disappointment). Sandbox games need large, complex simulation algorithms behind the scenes that the player can toy with (Spore has 7 or so terraforming tools, 3 of which do the exact opposite of another 3, and 130 "stamps" the player has little control over), so that players create things because of the way the algorithms combine into a larger, more complex system. Making something interesting is the "win" for sandboxes. If the Creature Creator (or the other editors) counts, then it is a game by itself; if it does not, then Spore is not a sandbox. I have been told Spore is a sandbox game because of the content-creation editors; it WOULD BE if those creatures INTERACTED WITH EACH OTHER ON A NON-SAPIENT LEVEL and if sapient creatures could be placed on a new planet and colonize it without needing to be spacefaring.
Yeah, I know about Portal. And I'm wondering whether they got the game to be better because they hired OMG massive numbers of testers and smart people, or because they just have some good people at Valve already. In either case, the game's core gameplay element of moving through a maze/puzzle space by making portals between areas came from people who weren't getting paid to come up with ideas. It just happened. And might I point out that Portal also has decent visuals and things like motion blur and high dynamic range and all that stuff, despite the fact that it's lots of fun. You make or buy a graphics engine with advanced features, and that's THAT, you can then focus on making all sorts of fun games that will run on it.
Let me just sum up my points now. I think it's absurd that I should pay $50 or more for a game which isn't excellent in many respects. For that money, I should be able to get a game with good single player, good multiplayer, nice graphics, realistic physics, fun gameplay, and a good amount of content. It'd be like if I went to a movie and it was a cheap knockoff of Lord of the Rings with unintentionally shitty special effects and dialog, and it was half as long, for the same price. I have yet to see a good reason why any company should be excused for doing some aspect of a game poorly when others can do an excellent job all-around, given that I'm paying the same amount. I'm not going to excuse poor gameplay, or poor graphics. I want them both to be very good to excellent for me to consider buying their product.
Now, as for Portal, they didn't make a new engine for it, they did it using the existing HL2 engine, so less time was spent on the graphics and was instead spent improving the puzzles and the physics engine (listen to the dev commentary about how they hacked that one together!).
OTOH, look at TF2: those aren't HD graphics by any standard; in fact, they're quite simplistic, but they do the job better than super-real graphics would because of their iconic look. While a style choice, it does make quick gameplay easy. No more "is this empty beer bottle a plot-important item? What do I do with it?" since the only interactable objects separate themselves from the background cleanly.
(As a side note: why do all FPS games, or most of them, take place in a post-apocalyptic world with GARBAGE EVERYWHERE? Alternatively, a war zone with exploded buildings accomplishing much the same effect.)
Yes, Portal did use an existing graphics and physics engine with some modifications--but so what? I didn't say that you had to put together an entirely new game engine for every title; in fact, if companies did that, although there would be more variety in how games look, there would be fewer games, they'd cost more, and they'd look worse! Portal managed to keep a level of graphics which is absolutely acceptable, especially when you realize that it's not focusing on graphics at all.
TF2 has great graphics, I've gotta disagree with you. The game looks gorgeous in 1080p with everything cranked up. There's a vibrancy in all the colors, a certain softness, some light bloom and motion blur, and pleasantly rounded and cartoony visuals which combine in a way which is quite appealing to the eye. Although they didn't make the game the most OMGrealistic-looking title ever, BY NO MEANS did they ignore its visuals. Let's not forget the "modern" visual effects that they do use--motion blur, soft particles, Phong shading, and in fact, high dynamic range.
I'm also not saying that every game needs its own engine; I'm saying that less work was needed because the engine already existed. My point was that they didn't need to spend money producing high-quality graphics in order to have them (they already existed) and instead they were able to focus on gameplay and the physics of the portals (getting a cube to collide with itself is funny as hell). Yes, money was spent producing those graphics, but that cost was mostly associated with another game.
I don't really know where any of this is going, to be frank. I don't think there's anything you can say to convince me that any aspect of a game should be less than amazing =P
Don't act like good graphics and good gameplay are mutually exclusive.
The larger the budget for developing a game gets, the less likely upper management is to approve riskier gameplay elements being put into the game. What you end up with is a game that doesn't stray too far off the well-trodden path and provides safe, enjoyable gameplay that will be forgotten. This is a problem that plagued platformers in the '80s and '90s, and a problem that is plaguing first-person shooters today.
So what you'll sometimes see in games these days is advertising or options for HDR, and that typically means that they're doing tricks to make the lighting look more like what you would see in real life. Obviously if you're only using 24-bit color in a game and your monitor only supports 24-bit color, you can't truly get brighter whites and darker blacks, but keeping track of an extended range of colors with better precision lets lighting effects and calculations be carried out in a more realistic manner, without just clipping values when they're too high or too low. The game can, of course, choose to do certain cool things with this, such as light bloom (anything brighter than white floods out a little into neighboring regions), or simulating the change of pupil size, which Crysis and Half-Life 2 do (so when you go into a darker area, the details come into focus, and light areas become washed out--and when you go into a bright area, things are initially bright and washed out, before settling back down to normal levels).
The graphics engine in a game doesn't care what colour bit-depth the monitor can handle. Whatever colour it puts out is just going to be rounded to the nearest available colour that the monitor can display.
Sure it cares. If it knows that your monitor can only handle 24 bits of color, and it's storing much more precision than that, at each frame it can ask "What value should I call white, and what value should I call black?" and reassign them at will. A boring graphics engine will just pick two values and stick to them. A better engine will consider things like overall scene brightness and special effects so that it can do the HDR effects I described before--shift the high and low down towards blacks and it simulates adapting to a darker indoor environment, shift it towards white and it's like getting adapted to an outdoor environment. And as I said before, light bloom can also be managed if you know what's bright versus what's BRIGHT.
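To make that concrete, here's a minimal sketch of that kind of per-frame adaptation in Python (the constants are arbitrary, and it's not how any particular shipping engine does it):

# Track the scene's average brightness and drift toward it over a few frames,
# which is what produces the "eyes adjusting" effect described above.
import numpy as np

class EyeAdaptation:
    def __init__(self, key=0.18, speed=0.05):
        self.key = key            # target mid-grey after exposure is applied
        self.speed = speed        # fraction of the gap closed each frame
        self.adapted = 1.0        # running estimate of scene luminance

    def exposure_for(self, frame_luminance):
        avg = float(np.exp(np.mean(np.log(frame_luminance + 1e-6))))  # log-average brightness
        self.adapted += (avg - self.adapted) * self.speed             # drift toward the new scene
        return self.key / self.adapted    # bright scene -> low exposure, dark scene -> high

adapt = EyeAdaptation()
hallway  = np.full(1000, 0.2)     # dim indoor area
sunlight = np.full(1000, 50.0)    # stepping outside
for frame in range(3):
    print("indoors: ", round(adapt.exposure_for(hallway), 3))
for frame in range(3):
    print("outdoors:", round(adapt.exposure_for(sunlight), 3))

Run over a few frames, the printed exposure climbs slowly in the dark hallway and then drops sharply when you step into the sunlight before settling, which is exactly the washed-out-then-normal effect described above.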
Although not part of the official standard for Blu-ray discs, some players now include support for "deep color" -- up to 16 bits per color channel instead of the current 8 bits -- which is already more than the 8 bits that standard computer graphics hardware supports.
http://www.eetasia.com/ARTICLES/200.....URCES=DOWNLOAD
http://www.tacp.toshiba.com/dvd/pro.....p?model=hd-xa2
A screen filled with an image of the blue sky is an example where 8 bits is pathetically inadequate. A computer-generated image with a "smooth" color gradation from one edge of the screen to the other will show visible banding where one color changes to the next, due to the ability to display only a few different colors across 1K pixels. The usual way to fake it is to use dithering: mixing pixels of one shade in among the pixels of another. The eye tends to blur them together, making the transitions less obvious. Deeper color will reduce the need for this.
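A tiny sketch of that dithering trick in Python, with only 32 levels on purpose to exaggerate the banding:

# Quantize a smooth sky-like ramp with and without noise added first, and
# measure how wide the flat bands come out.
import numpy as np

ramp = np.linspace(0.0, 1.0, 1920)     # smooth gradient across a 1920-pixel-wide screen
levels = 32                            # few levels on purpose, to exaggerate the banding

quantize = lambda x: np.round(np.clip(x, 0, 1) * (levels - 1)) / (levels - 1)
noise = (np.random.rand(ramp.size) - 0.5) * 2 / (levels - 1)   # +/- one quantization step

banded   = quantize(ramp)
dithered = quantize(ramp + noise)

def widest_flat_run(x):
    """Length of the longest run of identical adjacent pixels -- a visible band."""
    change = np.flatnonzero(np.diff(x) != 0)
    edges = np.concatenate(([-1], change, [x.size - 1]))
    return int(np.diff(edges).max())

print("widest band, plain quantization:", widest_flat_run(banded))    # ~60 pixels wide
print("widest band, dithered:          ", widest_flat_run(dithered))  # much narrower -- the eye blurs it smooth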
The PNG image format has always supported more than 8 bits per color channel, although it normally isn't used when motion is involved. MPEG-2 can be used for that.
Thanks for lending your expertise!
Recent-generation graphics cards support HDMI 1.3, but I'm not sure if they can actually generate more than 10-bit video. At the moment, only ATI supports BD/HD-compatible audio through the same connector; NVIDIA cards seem to be limited to the audio formats supported by S/PDIF (stereo PCM, DD & DTS).
Supposedly most (all?) games only support 8-bit video because the programming effort to support a non-multiple of 8 is extremely difficult, and 16-bit support is still much too processor intensive.
This is all third-hand. I have not yet found a video card spec sheet with the details of its HDMI support.
Given time things will improve, they always do... *off to food*
So far I've only seen one display that shows true HDR, and it costs around $49,000 US -- or it did when I saw it, don't know about now. But it not only had the contrast ratio, it had the actual guts needed to display true HDR: something like a 200,000:1 contrast ratio if memory serves, and true 16 bits per channel (48-bit color). It also had a luminance range of 0 cd/m² to 4,000 cd/m².
It's for medical imaging and extremely high-end imaging. Unless you've got 50 grand, I doubt you could get something that displays that extreme a dynamic range.
But yes, displays which have extremely good contrast ratios and brightnesses and all that stuff are quite expensive--they use selective backlighting to give full brightness to bright areas without having to flood all the dark areas with all that extra backlighting. My perception professor is confident that they will come down over the next few years into the more affordable (several thousand dollars) range once it starts to go mainstream and more companies try making them.
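As a rough illustration of the selective-backlighting idea (my own simplification in Python, not any vendor's actual algorithm):

# Split the frame into zones, drive each zone's backlight from the brightest
# pixel it contains, and let dark zones go genuinely dark instead of leaking
# a uniform glow across the whole screen.
import numpy as np

frame = np.random.rand(8, 12) ** 3                    # stand-in image, 0..1, mostly dark
zones = frame.reshape(4, 2, 4, 3).max(axis=(1, 3))    # 4x4 zones of 2x3 pixels each

backlight = np.clip(zones, 0.02, 1.0)                 # dim-but-nonzero floor per zone
print(np.round(backlight, 2))                         # bright zones lit, dark zones nearly off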
That might just be what differentiates the HD of the 2005-2010 era from the equivalent of the 2010-2015 era. They'll still just offer 1080p, but they'll be pushing 120 Hz, >100k:1 contrast ratios, and 48-bit color as the must-haves.
The numbers that really matter are the bits per channel and the luminance range.
I really hope they don't stay at 1080p. I had an insanely large CRT monitor at one time that would letterbox widescreen resolutions, and it still had a larger viewing area than most larger flat panels. 1080p was not enough for it; I ran it at the highest widescreen resolutions it would go to, and honestly, there are much better resolutions than 1080p.
Most modern HDTVs are OK for TVs, but I'd really like to play games in 1600p like I did on my computer before the monitor died. Half-Life looks insane at those resolutions -- screw AA, the pixels are so small it doesn't matter anyway, unless of course your monitor is insanely oversized.
Just as a note though, I played Half-Life 2 at 720p or 1080p, because my graphics card couldn't handle Half-Life 2 at the same resolution as the original. I wouldn't want to sound like I have a supercomputer.
The problem you had isn't the resolution, it's the aspect ratio. In fact you probably had the same or better horizontal resolution, but because they wanted to make the movie extra-widescreen for whatever reason, we still get boxes on the top and bottom. Really, I have that problem with every video out there; almost no movies are actually 16:9, much less the 16:10 that my display has. It's just something we gotta live with, unless/until they come to one standard aspect ratio D:
Like that's ever gonna happen.
Movie makers have to have something to distinguish their products from TV!
I don't -know- that this is true, but I suspect it is, and if it is, then you can get much more contrast bang for your buck by having pixel brightness work on a logarithmic scale instead of a linear scale. Better yet, you can probably get an excellent dynamic range without having a huge pixel depth.
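For what it's worth, here's a quick numerical check of that hunch in Python, using the 1000:1 example range from earlier (toy numbers, not any standard encoding):

# Worst-case relative jump between adjacent codes for a linear 8-bit encoding
# versus a logarithmic one, over the same 1000:1 luminance range.
import numpy as np

lum_min, lum_max, codes = 0.5, 500.0, 256

linear_levels = np.linspace(lum_min, lum_max, codes)
log_levels    = np.geomspace(lum_min, lum_max, codes)   # equal *ratios* between codes

worst_step = lambda lum: (np.diff(lum) / lum[:-1]).max()
print(f"linear 8-bit: worst adjacent step ~ {worst_step(linear_levels):.0%} (down near black)")
print(f"log 8-bit:    every adjacent step ~ {worst_step(log_levels):.1%}")

The linear encoding wastes nearly all of its codes up near white, which is why its worst step near black is so huge, while the log encoding spreads them so every step is the same small ratio.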
What do you think?