Since NES games were developed for CRTs, they were developed with a gamma around 2.2-2.5, correct? On my LCD monitor, hues of the same luminance are much closer to each other in value than on my CRT monitor. This makes most games (such as Bionic Commando, Vice: Project Doom, and Trojan) that use same-luminance/different-hue shading ($07 as a shade of $06, $0c as a shade of $03, etc.) look worse on the LCD than on the CRT. However, it makes some games (Laser Invasion, Castlevania III, Ninja Gaiden II) look worse on the CRT than on the LCD.
I've read that modern LCDs are supposed to have a gamma of around 2.2 - my LCD passes an LCD gamma test for around 2.2, and my CRT passes a CRT gamma test for around 2.2 also, which is why I am confused about how the perceived gamma is lower on the LCD (probably linear) than on the CRT. The gamma correction control provided by my video card is set to 1.00, which should leave the monitor's gamma at its default. My guess is that the LCD monitor itself (and maybe most LCDs in general, if mine passes the gamma test) assumes that the input is always intended for a linear gamma, so it gamma corrects it to compensate for its original 2.2. That's only my guess though...does anybody know for sure why this would happen?
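If that guess were right, the numbers would work out like this (a sketch assuming an ideal 2.2-gamma CRT and an LCD that applies an extra 1/2.2 pre-correction to the input):

```python
# Sketch: what happens if a display applies an extra 1/2.2 "correction"
# to a signal that was already gamma-encoded for a 2.2 display.
signal = 0.5                                # NES-era content, authored for gamma 2.2

crt_light = signal ** 2.2                   # CRT decodes with its native 2.2 gamma
lcd_light = (signal ** (1 / 2.2)) ** 2.2    # extra 1/2.2 encode cancels the 2.2

print(round(crt_light, 3))                  # ~0.218: midtones come out dark
print(round(lcd_light, 3))                  # 0.5: net result is linear, midtones lifted
```

The lifted midtones would compress the apparent differences between colors that were spread out by the 2.2 curve, which matches the "hues of the same luminance look closer together" symptom.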
My other question is why do Laser Invasion, CV3, and NG2 look better with the perceived linear gamma from the LCD...maybe these programmers worked under linear gamma? I kind of doubt it, though, since CRTs are usually preferred over LCDs for professionals.
I'm pretty sure that this isn't an NTSC issue, since gamma can affect all kinds of video, whether NTSC, PAL, digital, etc.
Most likely your LCD's NTSC composite video decoder is causing the differences, since the NES composite signal is somewhat non-standard. I get the idea that all digital TVs suck when fed analog video signals.
blargg wrote:
Most likely your LCD's NTSC composite video decoder is causing the differences, since the NES composite signal is somewhat non-standard. I get the idea that all digital TVs suck when fed analog video signals.
Actually, I meant NES emulators for PC (with relatively accurate palettes) on computer monitors, not TV monitors. But, having both LCD and CRT TVs, I'm guessing that this gamma problem happens with a real NES connected to an LCD TV, since the gamma is also different in a similar way between the two TVs.
strangenesfreak wrote:
My other question is why do Laser Invasion, CV3, and NG2 look better with the perceived linear gamma from the LCD...maybe these programmers worked under linear gamma?
*shrug*
Different developers, different hardware adjustments, different preferences. It's not just the NES; Capcom's CPS1-era arcade teams probably used horribly miscalibrated monitors. In MAME, and on most arcade monitors, you can see all sorts of weird glitches and color blocks in dark areas:
In fact, if I reduce the brightness on my computer's CRT monitor, your image appears normal. I guess the programmers somewhat assumed very dark monitors when programming the game?
Perhaps Capcom got their monitors from the same factory that eventually made the LCDs for the original Game Boy Advance
I'm surprised that (AFAIK) very few people, if any, have noticed this gamma problem with NES games on LCDs, since I'd imagine it could really interfere with creating "accurate" palettes...one test pic to see if your monitor (or NES palette) needs to be gamma corrected would be Celius's avatar. Correct me if I'm wrong, but the Dracula form seems to use this kind of shading that's meant for the CRT gamma (2.2); it looks pretty nice with the correct gamma, but if it's too low (<1), it'll look ugly.
This is an issue I worry about when I actually display him in a game. I actually started a topic a while back about it, because after testing it out in several emulators, I found it didn't look good in a lot of them. But in others, it looked good. Slight color differences can really make something look like garbage.
EDIT: Sorry, you made me want to update my avatar. For anyone who wishes to see what he's talking about:
http://www.freewebs.com/the_bott/Avatar2.JPG
The new version won't be as sensitive to slight color differences, I don't think. It's a bit of a risky change, but I'll make up my mind about whether or not I like the new look later.
Wow Celius, your upgraded avatar is a lot better than the previous one!
And I don't have many gamma problems with my NES, but I have a lot of them with my PS2. Most games look terribly dark; it's often impossible to see anything on screen unless it's night. For this reason I only play the PS2 in winter or during the night so that there is less sunlight.
Some games with colorful graphics like Dragon Quest 8 are completely spared, but some darker games are unplayable unless it's completely dark.
The problem may also come from a weird AV -> RF converter that I believe was made by my dad; maybe he got something a little wrong with the luminance?
PS: In the game I'm making, I use red pixels surrounded by black to simulate punk needles on my character's belt. Unfortunately those are invisible on my CRT screen. I hesitate to modify the sprite, but I don't know what to do to make them look good.
Quote:
On my LCD monitor, hues of the same luminance are much closer to each other in value than on my CRT monitor.
Properly calibrated sRGB LCD monitors should look exactly like properly calibrated sRGB CRT monitors. If not, it's a calibration problem, nothing to do in particular with the NES.
sRGB has a somewhat complicated gamma. Generally, people creating palettes should use a properly calibrated monitor in a defined viewing environment.
If you're using an actual television connected to a NES as a reference, keep in mind that American NTSC uses a setup of 7.5%, meaning they are darker than Japanese NTSC at the same gamma.
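A rough sketch of what the 7.5% setup ("pedestal") does to levels. The mapping below is the usual IRE convention; the mid-grey value is just an illustrative input:

```python
# Sketch: mapping a normalized picture level (0 = black, 1 = white) to IRE
# units, with and without the 7.5 IRE setup used in American NTSC.
def to_ire_us(level):
    return 7.5 + level * (100.0 - 7.5)    # black sits at 7.5 IRE

def to_ire_jp(level):
    return level * 100.0                  # black sits at 0 IRE

# A set expecting setup treats 7.5 IRE as black; fed a Japanese-style
# signal (no setup), the same mid grey decodes to a darker level:
def from_ire_us(ire):
    return max(0.0, (ire - 7.5) / (100.0 - 7.5))

print(from_ire_us(to_ire_jp(0.5)))        # ~0.459: darker than the intended 0.5
```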
Also keep in mind that on the NES, most of the 0x colors generate negative RGB values after decoding, which will have different effects on different television sets. This is actually of great importance, because this is what determines whether color 0x07 will look "brown" or "red-brown, almost red". Konami games such as Life Force will expect the latter, Ghosts 'n' Goblins will expect the former, judging by how it is used with other colors.
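A sketch of what "negative RGB after decoding" means. The YIQ-to-RGB matrix coefficients below are the standard NTSC ones; the input chroma values are illustrative, not measured NES $07 values:

```python
def yiq_to_rgb(y, i, q):
    """Standard NTSC YIQ -> RGB decode matrix."""
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return r, g, b

# A saturated low-luma color can decode to a negative blue component.
# One TV might simply clip it to zero, another might handle it differently,
# which is part of why a color like $07 looks "brown" on one set and
# "red-brown, almost red" on another.
r, g, b = yiq_to_rgb(0.2, 0.5, 0.0)       # hypothetical dark, saturated orange
clipped = tuple(max(0.0, min(1.0, c)) for c in (r, g, b))
```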
I have tested Battletoads's 1st level (the ground and the boss's red display), and after playing with the brightness on both the LCD computer monitor (using an accurate palette with a PC emulator) and my CRT TV (running a real NES), I can confirm that the LCD monitor's gamma is off from the CRT's gamma.
I think the real problem here is that my LCD monitor kind of sucks. I cannot directly change the gamma of my monitor without touching my video card settings, and neither the monitor itself nor my video card can change sharpness. Thus, my monitor can never properly pass the Lagom sharpness test. In addition, I can never differentiate luminances 30-32 in this contrast test with decent black and white levels.
For what it's worth, my monitor is an Acer AL1715 and my computer uses a GeForce 2 Integrated GPU as a video card.
Sorry for double posting, but I tested my NES on my LCD TV; it looked wrong at 75% brightness, but after resetting to factory settings (25% brightness), it looks perfect. So I don't think the LCD TV has a gamma problem.
After reading this, it appears that all of my monitors have 2.2 gamma; they just interpret the input's gamma differently. So according to this article, video (I'm guessing this means TVs) interprets the input gamma as 0.45. I'm now guessing that my LCD PC monitor adheres to the "Computer Graphics" gamma standards, since gamma looks linear here and it is for the input images of "Computer Graphics". I think my CRT PC monitors are either following the video (or even Macintosh) gamma standards for some reason, or are part of the monitors that have "poor or no standards" as mentioned in the article.
It's worse than that.
Many games were developed for consumer NTSC on the NES/Famicom.
What this means is that many games originally developed in Japan were made for a yellow-boosted palette, which was common on JP TVs at the time.
US TVs usually boosted the reds instead.
If a game was developed in the US, the "consumer" palette in a NES emu generally gives correct output. But if the game was originally Japanese, the yellow boost should be turned on. The RGB versions of the NES chip most closely replicate the yellow-boosted palette, which is why RGB-modded NESes look different from normal ones (there's also the different handling of the color emphasis bits). All the screenshots in Nintendo Power were taken with a Famicom Titler, which is effectively an RGB NES, which is why they didn't look like they did on your TV.
On a JP consumer TV, Fire Mario wears yellow clothes instead of white. This is replicated in the arcade version. On US TVs, he wears white instead. Luigi wears white on either, though.
Nowadays, TVs don't do that anymore. A correct FPGA NES would try to autodetect whether the game needs a red or a yellow boost to display correctly and adjust the palette as needed.
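Such a "boost" can be pictured as a decoder hue (tint) offset: rotating the chroma phase shifts all hues at once, which is how a TV's tint control works. A sketch; the rotation angles are made-up illustrative values, not measured JP/US figures:

```python
import math

def rotate_chroma(i, q, degrees):
    """Rotate the I/Q chroma vector, as a TV's hue/tint control does."""
    a = math.radians(degrees)
    return (i * math.cos(a) - q * math.sin(a),
            i * math.sin(a) + q * math.cos(a))

# The same encoded chroma, decoded with two different hue offsets,
# lands on two different hues -- one set's "red", another's "yellow".
i, q = 0.3, 0.1
jp = rotate_chroma(i, q, +15.0)   # hypothetical JP-style offset
us = rotate_chroma(i, q, -15.0)   # hypothetical US-style offset
```

Note that the rotation preserves saturation (the chroma vector's length); only the hue angle changes.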
I think I finally understand what's going on now. :D After rereading the Wikipedia article on gamma correction, it looks like input gamma is encoded differently by PCs and TVs. As it says, PCs encode the input gamma as 0.45, while for TVs, there is usually no encoding needed; this is how I'm interpreting "does not usually require further gamma correction". So I guess that means that on most PCs, some stuff meant for TVs, including video game graphics such as those for the NES, has over-corrected gamma. Is this understanding correct?
The common gamma value of 2.2 (or 0.45 in reciprocal) includes the 2.0 from the fact that the power of the electron beam is proportional to the square of voltage under constant resistance (P = I^2*R = V^2/R). I don't know where the remaining 0.2 comes from.
My understanding is that both a properly adjusted PC monitor and a properly adjusted TV will use this gamma value 2.2. But the black level control on consumer TVs is labeled "brightness", encouraging users to set it too high, and a high black level reduces the effective gamma value.
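That effect is easy to put numbers on (a sketch; the 10% raised black level is an arbitrary example):

```python
import math

v = 0.5                             # gamma-encoded mid grey
normal = v ** 2.2                   # correct black level
raised = 0.10 + 0.90 * v ** 2.2     # black level misadjusted up to 10%

# Effective gamma implied by the midtone: out = v ** g  =>  g = log(out)/log(v)
g_normal = math.log(normal) / math.log(v)   # 2.2
g_raised = math.log(raised) / math.log(v)   # ~1.76: behaves like a lower gamma
```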
What if the default settings of monitors aren't the proper settings? My LCD PC monitor's default settings put contrast at 50 (out of 100) and brightness at 100. If I put contrast at 100 and brightness at 0, it fixes the gamma problem somewhat, but it actually becomes extremely sensitive to viewing angle, so some pairs of colors (such as $0c and $02) still flip in luminance depending on whether you're looking at the top or bottom of the screen. My LCD HDTV's default settings set brightness at around 33 out of 100 (there's no number) and contrast at around 67 out of 100, but there's no gamma problem.
Quote:
As it says, PCs encode the input gamma as 0.45, while for TVs, there is usually no encoding needed; this is how I'm interpreting "does not usually require further gamma correction". So I guess that means that on most PCs, some stuff meant for TVs, including video game graphics such as those for the NES, have over-corrected gamma. Is this understanding correct?
No. Ceteris paribus, PC's sRGB and TV's NTSC specifications have very, very similar display characteristics, both gamma- and chromaticity-wise.
Unless you get a good (i.e. expensive LCD or CRT) television and have it properly calibrated, any theorizing about what PCs do or don't do is a waste of time, because you never know if your monitor's decoding or your PC's encoding is crap. In 99.999% of cases with LCD monitors, it's the monitor that is crap, mostly because of ridiculous default settings. LCD PC monitors are a cancer on the world.
"Properly calibrated" means using a colorimeter. OSD presets are next to completely useless; in fact, even on expensive monitors, the "native" setting is often closer to sRGB than the "sRGB" preset setting in the OSD.
NewRisingSun wrote:
LCD PC monitors are a cancer on the world.
What would you rather have on a laptop PC?
Quote:
"Properly calibrated" means using a colorimeter.
Do the various "gamma test" images found online, such as those in this Wikipedia article, help any?