Gamma correction!
Before anybody says "But there's a setting for that!" let me just say I KNOW. That's NOT what I'm talking about; it's only related in the sense that both what I'm suggesting and what VO already has involve the nonlinear relation between input voltage and pixel brightness on a CRT.
Background, first. (We'll talk about grayscale CRT screens for simplicity.)
One way or another, in computer graphics, you usually end up with a number between 0 and 255 for the brightness of a given pixel. 0 is black. 255 is white. But 127 is not, in fact, half as bright as 255... The reason is that this number is used to drive the input voltage of the CRT, and does not have any other explicit meaning. 0 is the minimum voltage (the electron gun fires almost not at all, black) and 255 is the maximum voltage (the electron gun fires as much as it can, white). 127 is halfway in between, which, on a typical PC CRT, is actually just over 20% as bright as 255. If you want a color "half as bright" as white, you would need to use 186 instead.
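(If you want to see where those numbers come from, here's a quick toy calculation, assuming a plain 2.2 power law as a stand-in for the exact sRGB curve:)

    /* Toy example: how bright is pixel value 127, and which pixel value is
       actually half as bright as 255? Assumes display gamma = 2.2. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double gamma = 2.2;
        printf("127 is %.1f%% as bright as 255\n",
               100.0 * pow(127.0 / 255.0, gamma));   /* ~21.6% */
        printf("half of 255's brightness is pixel value %.0f\n",
               255.0 * pow(0.5, 1.0 / gamma));       /* ~186 */
        return 0;
    }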
This didn't mean much for a while, though. For an artist it doesn't matter because you're correcting for this unconsciously. (You're not mathematically choosing what color you draw with, you're picking the one that looks right.) For everyone else, it doesn't matter; the artist already accounted for it, so it looks "right." (Digital photography, etc. also looks "right" because that hardware is performing gamma correction, too.)
The problem happens when the computer generates its own images, or modifies the artists' images in some way. As, for example, with lighting, alpha blending, etc....
Most graphics programmers who want to blend halfway between 0 and 255 end up with 127, that being pretty close to the average of the two. But this "average" is actually only 20% white. It gets worse when the colors aren't extreme or the blending factor isn't 50%. This is why shadows, explosions, particle effects, etc. looked just a bit "wrong" in past-generation games (and some current-generation ones).
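(Same toy math applied to that blend, again pretending the display is a pure 2.2 power law:)

    /* Naive pixel-value average vs. averaging in linear light. */
    #include <math.h>
    #include <stdio.h>

    static double decode(double v) { return pow(v / 255.0, 2.2); }       /* pixel -> linear */
    static double encode(double l) { return 255.0 * pow(l, 1.0 / 2.2); } /* linear -> pixel */

    int main(void) {
        int naive = (0 + 255) / 2;                                 /* 127: only ~21% as bright as white */
        double correct = encode((decode(0) + decode(255)) / 2.0);  /* ~186: actually half as bright */
        printf("naive = %d, gamma-correct = %.0f\n", naive, correct);
        return 0;
    }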
For a while, graphics hardware did this too. Modern hardware (any with pixel shaders, some without) is capable of accounting for this, though. It blends 0 and 255 and gets (correctly) 186. If they just turned around and did this in all games, though, it'd look "wrong" (different). So this support has to be asked for.
In OpenGL, you'd use the EXT_texture_sRGB extension to inform the renderer that some of your textures are actually gamma-encoded, and others (normal maps, for instance) are not. You'd then use the EXT_framebuffer_sRGB extension to inform the renderer that the framebuffer is intended to be gamma-encoded, and it will do its calculations in a linear colorspace while writing pixels in the screen's colorspace.
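(To make that concrete, here's a rough sketch of the calls involved. This isn't VO's code; the texture IDs, sizes, and pixel pointers are placeholders, and you'd check the extension strings before relying on any of it.)

    /* Sketch only: assumes a GL context and that EXT_texture_sRGB and
       EXT_framebuffer_sRGB are both advertised. Headers are
       <GL/gl.h>/<GL/glext.h> on most platforms, <OpenGL/gl.h> on OSX. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    static void upload_textures(GLuint color_tex, GLuint normal_tex,
                                int w, int h,
                                const void *color_pixels,
                                const void *normal_pixels)
    {
        /* Color texture: tagged sRGB, so sampling converts it to linear
           light before filtering and blending. */
        glBindTexture(GL_TEXTURE_2D, color_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8_EXT, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, color_pixels);

        /* Normal map: stays linear, since its values aren't colors. */
        glBindTexture(GL_TEXTURE_2D, normal_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, normal_pixels);
    }

    static void enable_gamma_correct_output(void)
    {
        /* Blending and lighting stay linear; results are gamma-encoded
           as they're written to the framebuffer. */
        glEnable(GL_FRAMEBUFFER_SRGB_EXT);
    }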
Unfortunately, no OpenGL implementation I've used yet has supported EXT_framebuffer_sRGB. However, recent versions of OSX do support EXT_texture_sRGB. A method I've used with great success is to render the entire scene at 48 bpp (16 bits per channel) in a linear colorspace, then use a shader to convert that to the screen's colorspace. (It would blend 0 and 65535 and get 32767, which it then non-linearly downsamples to 186.)
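(Sketched out below, with placeholder names like fbo and scene_tex, all the FBO and shader-object boilerplate omitted, and a plain 1/2.2 power standing in for the exact sRGB encode:)

    /* Render into a 16-bit-per-channel (48 bpp RGB) linear target... */
    static void make_linear_render_target(GLuint fbo, GLuint scene_tex,
                                          int w, int h)
    {
        glBindTexture(GL_TEXTURE_2D, scene_tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16, w, h, 0,
                     GL_RGB, GL_UNSIGNED_SHORT, NULL);
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                  GL_TEXTURE_2D, scene_tex, 0);
        /* ...render the whole scene into this buffer, in linear light... */
    }

    /* ...then convert to the screen's colorspace in one full-screen pass. */
    static const char *convert_frag_src =
        "uniform sampler2D scene;\n"
        "void main() {\n"
        "    vec3 lin = texture2D(scene, gl_TexCoord[0].xy).rgb;\n"
        "    gl_FragColor = vec4(pow(lin, vec3(1.0 / 2.2)), 1.0);\n"
        "}\n";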
It works well, with a tiny performance hit, on hardware as old as my Radeon 9800 Pro. So well, in fact, that I wrote a shim that can shoehorn such correction into most games. It doesn't work with games that use glColorPointer or that use texture maps to hold non-color data such as normal maps. Since VO is one of those games, I can't post screenshots of what it would look like corrected. However:
http://joshua.tejat.net/gammathing/
Don't ask where I got the models. :|
Sample code available on request.
-:sigma.SB
P.S. The primary graphical difference between Metroid Prime and Metroid Prime 2: Echoes is that Echoes gamma corrects while the original does not.
P.P.S. I carefully calibrate all my screens to the sRGB colorspace. VO carelessly discards that calibration on every run. Can we PLEASE have a setting to disable this gamma "correction" entirely?
My name is Liath, and I approve of this post.
Well thought out!!
Go Solra! Whooo! Yay for gamma correction!
I bestow upon you the "Knoll award for outstanding Gamma Correction"!
Yeah, this is an interesting (older) article about gamma, too.
http://renderwonk.com/blog/index.php/archive/adventures-with-gamma-correct-rendering/
Interesting article. (It talks about banding... 16-bit LDR is not quite enough, as you can see from, for instance, the shadows on the Prometheus.)
It also linked to an unrelated awesome article on premultiplied alpha, which I am now in love with.
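(If you haven't read it, the one-line GL version: the texture stores color already multiplied by alpha, so the source color doesn't need to be scaled by alpha at blend time. Illustration only, not VO's code:)

    /* Premultiplied alpha: texture holds (r*a, g*a, b*a, a). */
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    /* versus the usual straight-alpha blend:
       glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); */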
So are you ready to email me the refgl source code so I can do the legwork for you yet? :D
-:sigma.SB
Bump, because I have a CRT now and I actually care!
Oh wait, wrong thread...
Anywho, VO resets my gamma correction setting every time I quit.
'Tis a pain in the arse having to reset ATI CCC to make my GIMP pictures accurate when I get out of the game.
There was a bugs thread about this... and it was fixed. Go make a bug thread!