I was discussing the NES video output and realized that it's actually not 240p. It's 480p with every other line black, and at double the brightness of a normal picture on the same TV. That is, if you had a 480i60 CRT and a 480p60 CRT, you could get the 480p60 CRT showing an image identical to the 480i60 CRT showing a NES only if you did the above two things for the NES image. And if you had a true 240p60 monitor, you would not be able to get an image that looks like a NES, as your scanlines would be way too fat vertically.
The NES clearly isn't merely 240p, because you get very noticeable black scanlines between everything. In a way, these black lines are part of the picture. And because each scanline is painted 60 times per second, rather than the usual 30 it would get in a normal interlaced signal, it's effectively twice the brightness a normal picture's scanlines would have.
I think my brain had an error...
The hsync is still 15.something kHz; there's no 480p,
and if it weren't ~15 kHz my TV would surely complain about it...
The way a TV displays a so-called 240p signal is as if it has an extra black scanline between each displayed one, and this is integral to the appearance of old consoles that used it. Thus, they are really more like 480p with every other line black.
Interlace mode is displayed by having a half line, which then "restarts" the horizontal ramp. That translates to a small "delay", and the lines of the new field end up in the middle of the old ones. Progressive mode has no half lines, or an even number of them so they cancel each other out, and the lines of the new field go on top of the old.
In both cases the Hsync signal that defines line length (and amount of lines within vertical ramp, which is restarted by a Vsync pulse) is the same.
Operation of a CRT is so deviously simple... 2 ramp generators and some high voltage (and a bit of amplification).
For 480"p" you need 526 or 524 lines per vertical ramp; you only get 262 or 263, depending on the source device.
Hsync speed = lines * fields
Since Hsync is around 15.something kHz, you can only have around ~260 lines in the case of 60 fields.
...and is that a serious topic or something I have missed at some point....?
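Tiido's line-rate relation above can be sanity-checked with a line of arithmetic. This is my own back-of-envelope sketch (the 15734 Hz figure is the standard NTSC line rate, not quoted from the post):

```python
# hsync_rate = lines_per_field * field_rate, so a ~15.7 kHz hsync at
# 60 fields/s can only fit around 262 lines per field, nowhere near the
# ~524 lines per vertical ramp that "480p" would require.
hsync_hz = 15734.0   # approximate NTSC horizontal line rate
field_rate = 60.0    # fields per second

lines_per_field = hsync_hz / field_rate
print(round(lines_per_field))  # ~262
```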
I... can't really tell if this is tongue in cheek or if you are being serious! The NES very surely puts out a 240p signal. As Tiido mentioned, hsync is ~15 kHz for sure; to be 480p it would have to be ~31 kHz. The black scanlines are just... scanlines; "scanline" is a poorly chosen name, as it really describes the gaps between the lines of rendering. You'll notice that brighter lines often appear fatter, and on a well-converged and well-focused CRT monitor or television this gap between lines won't be so visible.
I promise, it's a 240p signal
Let's go to a different hypothetical example that's not the NES, so there's no preconceptions muddying things up.
PC monitors support interlaced and progressive signals. In interlaced mode, each field's scanlines are double spaced as compared to a progressive frame's scanlines. Each field scans every other line, leaving space for the interleaving lines that will be done on the next field. Let's say that this is a fixed-frequency monitor supporting 480i and 480p.
If you trick the monitor into displaying only one kind of field every time, your image's scanlines will be double-spaced. So if you were displaying an image with 240 scanlines, they'd take as much space vertically as 480 would in the monitor's progressive mode.
Now, let's say that you had another 240p-only monitor (no support for 480 at all). If you displayed the above image on it, it wouldn't look the same; the scanlines wouldn't have enough space between them. When you tricked the first monitor into displaying those 240 scanlines in the same vertical position every frame, you still had 240 black scanlines between them. But the 240p-only monitor's electron spot size is adjusted so that the scanlines mesh up against each other.
Now, let's say that you have a third monitor that only does 480p, no interlace support. Can you display the same thing you got on the first monitor? Yes, by showing a black scanline between each normal scanline. This will give you exactly the same appearance, since it's visually what happened on the first one when you tricked it to show the same field every time.
Thus, a 240-scanline image shown by tricking an interlaced monitor into showing the same field over and over is not the same as a 240p image on a 240p monitor, but is exactly like a 480p image made up of the 240 scanlines of the image interspersed with 240 black lines.
Thus my claim that the NES image is only properly represented as a 480p image, and that it's more than a mere 240p image. It's irrelevant what the video signal carries; what a TV tricked into displaying a progressive image shows is as if there were black scanlines between each one from the NES. If you want to focus on the video signal, then these are encoded into it by way of every frame having the same number of scanlines, without the marker on every other field to tell the TV to alternate fields.
Also the claim that it runs at twice brightness is related, since the TV normally only illuminates a given scanline 30 times a second, where here it's illuminated 60 times a second.
The only difference between progressive and interlace is the lack of half lines. A 480p monitor is able to do 960i by design (if we're speaking about analog stuff), and likewise a 240p monitor can do 480i.
The so-called progressive image happens only because of the missing half lines, making each new field land on top of the old one instead of between its lines, leaving unscanned areas on the screen that we know as scanlines...
Scanlines only happen because the beam is not physically tall enough to cover the whole screen in the intended scan pattern. You can de-focus the beam so it covers the unscanned area... but it will look crappy with an interlaced signal (lots of vertical detail loss).
If you intend to show 240 lines on a monitor that only takes 480 lines, you have to add a blank line after every real line in the source material, but that is impossible in realtime; you'd need a (digital) buffering scheme...
You cannot make a 480-line-only monitor show 240-line source material; the monitor will either go out of sync or somehow show the 2 fields on top of each other, with half the effective frame rate.
And showing 480-line content on a 240-line monitor results in something like this if you are extremely lucky:
http://www.tmeeco.eu/BitShit/VGAtoTV.JPG
But usually it is something like this:
http://www.tmeeco.eu/BitShit/VGAtoTV2.JPG
It is true that conventional SDTV CRTs are only designed to display 480i, and that basically all 4th-generation and older consoles abuse them into displaying a progressive image. The electron beam diameter is tuned for 480i display: in this regard it is absolutely true that there are black lines between everything. However, smaller televisions can't really display 480i: there is a certain minimum size that the electron beam gets.
Similarly, end-of-generation multisync CRT monitors were tuned to display 1600x1200 or whatever, and when they were displaying particularly low resolution video modes (720x400 text, 320x200 "MCGA" graphics, 640x350 "EGA" graphics) you got the same "black lines" effect. It's misleading to say that these monitors' "true" resolution was anything in particular: they were continuous-space devices, with a blur kernel that was a function of the aperture grill and electron gun beam emission surface. In the same way, a SDTV CRT doesn't have a "true" resolution of 480p, and using any simple vertical-only blur filter on a 240p-upsampled-to-480p input is similarly inaccurate.
This is about the TV standard. It specifies that two neighboring frame fields must differ by one scanline; that is what enables interlace mode in the TV set, as originally conceived. But what if we feed the TV two exactly identical fields? We get progressive scan at half the resolution but double the frame rate. The NES PPU generates exactly the same odd and even fields (we know this from the decap), so it is a true 240p device. But every modern TV (especially LCD), understanding the original TV format, literally tries to deinterlace the progressive signal, and you get an ugly vertical stretch (instead of beautiful scanlines) at half the frame rate.
Think about it.
I think what blargg is trying to say is that the CRT's electron beam is focused differently. It's focused one way for monitors designed for 240 lines (such as CGA and 8-bit home computers) and another way for monitors designed for 480 lines, be they interlaced or progressive (such as TVs and early VGA monitors). It's like how with old CRT PC monitors, you can see the spaces between scanlines more clearly when running a 350p or 480p DOS game in full screen (in actual DOS, not a VM) than on your 768p Linux or Windows desktop, because the beam is focused for the 96 scanlines per inch that desktop environments assume.
I wouldn't even compare the mask quality of a regular TV with that of the simplest monitor. A monitor has finer phosphor grain, and the CRT beam is focused much better. I once had the experience of using a CRT monitor tube in a TV of the same diagonal. The results were impressive.
Just to be sure, does that imply that the resulting NES display on a common CRT shakes a little vertically, due to interlacing?
The NES does not shake on a CRT unless it's run through a device with a frame buffer that converts everything to interlaced, such as my DVD recorder.
Jarhmander wrote:
Just to be sure, does that imply that the resulting NES display on a common CRT shakes a little vertically, due to interlacing?
No, because it tricks the TV into displaying the same field every frame, rather than alternating.
No; on top of the old field, instead of between it.
Could you clarify your "no"? My understanding is that a proper interlaced signal basically has a flag every other frame (never mind how it's encoded) that causes the display to offset the field half a scanline vertically, and that these offset and non-offset fields are referred to as even and odd (again, irrelevant which is which for this post). The NES doesn't set this flag every other field, and thus the TV always displays the same field every frame. Hence my comment: it shows the same field (even/odd, not relevant here which) every frame. The same field implies that its scanlines are always in the same physical position on the CRT face, since that's what the even/odd means.
Scanlines at the same position each field is the definition of progressive. There are 240 of those lines, making the signal 240p. Yes, a given CRT will have more prominent space between the scanlines when displaying a 240p signal than when displaying a 480i signal. No, that doesn't make the signal any less 240p, just as a 350-line picture from a VGA card in full-screen EGA-compatible mode doesn't suddenly become 700p with every other line blank when displayed on a VGA monitor tuned for 768p.
There are no flags; the only thing that causes an interlaced signal is a half line. That half line causes a "restart" in the horizontal ramp generator. The ramp is fed to the deflection circuit and it moves the beam. The side effect of that half line is that the new line will be shifted half a line down, and the newly arriving lines will end up between the old ones. Remember that there is always a vertical ramp happening... it is a smooth sawtooth like the horizontal one, there are no "staircases", and this is the reason for the half-line vertical shift from the half-length horizontal line. If the line is less or more than half, the new field will be shifted that much further up or down.
If you omit that half line, or have an even number of them, the new field will just go on top of the other; an even number adds up to the equivalent of full lines.
Half line offset in the vertical sync pulse is what is being called the "flag", as I understand it.
And to a modern digital TV, it is nothing more than a really clunkily encoded flag (assuming it even recognizes pseudo-progressive signals like the NES puts out). Sorry if this whole thread was too much of an exercise in abstract thinking. It stemmed from my insight that one reason NES images are generally shown in a way that doesn't match the TV is that these implicit black spaces between scanlines aren't accounted for. By showing how the interlaced fields are effectively skipping lines (sorry, more abstract thinking), and that the NES shows only one kind of interlaced field, it's clearer how there is something more than just 240 scanlines if one displays it more like a TV would. For some reason I hadn't thought this all the way through before and hadn't fully registered this essential scanline spacing (even though it was painfully clear on larger CRT TVs how noticeable the black spaces were).
Modern TVs ignore them and deinterlace regardless of half lines, actually most of them are incapable of detecting the half lines in the first place (thus forced deinterlace, as most input is interlaced anyway). It is actually hard to detect them in a cheap and consistent manner in digital domain, mainly cheaply...
Another complicating factor is that the size of the electron beam is a function of brightness and CRT age: the brighter the setting and the older the electron gun, the wider the electron beam. A brand-new 480i CRT with the brightness turned down will have more visible inter-scanline gaps than a tube that has the brightness turned up and has been on for 100k hours.
But these problems would be true regardless of whether you were feeding the set 480i or 240p, so the empty interstices point is still true.
TmEE wrote:
It is actually hard to detect them in a cheap and consistent manner in digital domain, mainly cheaply...
Really? Isn't that the point of the double-rate serration Vsync, to identify interlaced content?
lidnariq wrote:
But these problems would be true regardless of whether you were feeding the set 480i or 240p, so the empty interstices point is still true.
When you feed the set 480i, it draws each field in the gaps between the other field's scanlines. Combined with persistence of vision, this results in an illusion of there being no gaps.
As for how this relates to a "scanlines" filter on an emulator, I have some ideas on how to implement that, involving a pixel shader with quadratic response.
lidnariq wrote:
TmEE wrote:
It is actually hard to detect them in a cheap and consistent manner in digital domain, mainly cheaply...
Really? Isn't that the point of the double-rate serration Vsync, to identify interlaced content?
If a decoder couldn't detect which field was which, it wouldn't be able to reliably even decode an interlaced signal (half the time it'd get them wrong and look awful). So I'd say that even current digital TVs are decoding this flag reliably.
I wish I knew how to program GL/DX shaders so that I could make a more authentic-looking "scanlines" filter incorporating brightness-based bloom and other concepts expressed in this topic. What I could do is develop a reference implementation of the filter in Python+PIL and AviSynth, make a video of it in action (as I
did recently for superimposing NES video on an animated background), and then later have one of you implement it in an emulator. Should I?
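For what it's worth, the core of such a reference filter can be sketched in a few lines of plain Python. This is my own construction, not blargg's actual filter; the quadratic gap response and the `bloom` parameter are illustrative assumptions:

```python
# Upscale 240 lines to 480 by inserting a "gap" line after each source line.
# The gap line's brightness grows quadratically with the source brightness,
# mimicking bright lines blooming into the black space between scanlines.
def scanline_filter(image, bloom=0.5):
    """image: list of rows, each a list of 0..255 luma values.
    Returns a double-height image with brightness-dependent gap lines."""
    out = []
    for row in image:
        out.append(row[:])  # the lit scanline itself
        # Quadratic response: only bright pixels fill the gap noticeably.
        gap = [int(bloom * (v / 255.0) ** 2 * 255) for v in row]
        out.append(gap)
    return out

frame = [[0, 128, 255]]          # one 3-pixel scanline
doubled = scanline_filter(frame)
print(doubled)  # [[0, 128, 255], [0, 32, 127]]
```

A real implementation would do this per color channel after gamma correction, but the structure is the same.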
Some people have put some effort into this:
http://filthypants.blogspot.com.au/2012 ... hader.html
http://ascii.textfiles.com/archives/3786
Neither of them seems to mention the non-linear instantaneous brightness->beam-shape distortion, though.
blargg wrote:
If a decoder couldn't detect which field was which, it wouldn't be able to reliably even decode an interlaced signal (half the time it'd get them wrong and look awful). So I'd say that even current digital TVs are decoding this flag reliably.
You can count on loss of sync on the PLL and then decide which field is which, but you won't get info on when exactly the sync was lost. That is the reason the two fields contain different numbers of half lines. It was mainly to help video editing apparatus function more effectively. For a CRT TV it does not matter; it just works as a consequence. There is no way to mess up the field order in any way: as long as there is at least one half line, you are guaranteed to have the proper shift between the fields.
A friend of mine is working on an upscaler device aimed at retro game consoles, and dealing with this problem really is not easy without complex techniques. But since it is retro-aimed, it can assume all input is non-interlaced.
In a CRT TV it all happens much more simply. How many of you really know how the vertical deflection system works?
1. It all starts with a sawtooth generator. It consists of a capacitor, a constant-current generator (to charge the capacitor), and a discharge switch. The current source guarantees linear charging of the capacitor. All you need is to close the switch before the capacitor gets fully charged. Obviously the switch is used every VBlank.
2. The output of the sawtooth generator feeds an amplifier that drives the vertical deflection coils. Nothing special, except very high linearity requirements on the amplifier. That's all.
3. The TV standard specifies that even and odd subframes differ in length by one scanline, so their VBlank periods differ slightly. Now re-read point 1: the capacitor is the same and the charging current is stable, so during the shorter subframe the capacitor receives a bit less charge, which means less voltage on it. The amplifier works only with AC, which shifts the output signal relative to zero. This shift is the interlace. Note that if the subframes differed by 2 scanlines, you would just redraw almost all the odd scanlines, except one at the top and one at the bottom. That's why the difference must be exactly one scanline.
Of course this causes jitter in the vertical deflection frequency, but the average frequency is stable. Thus there is no "odd/even flag", and a DSP simply counts scanlines between vertical syncs. And I remind you of my earlier point: the NES PPU generates every frame with a constant scanline count. Is that clear? If you don't believe me, you can always read the TV standard drafts or even look at a broadcast signal on an oscilloscope.
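A toy model of the mechanism HardWareMan describes, written as a sketch of my own (line counts are NTSC's; the `field_offsets` helper is hypothetical): the vertical position of each field is set by the phase of the line clock at vertical retrace, so 262.5-line fields interleave and 262-line fields overlap.

```python
# Each field's vertical start offset, in scanline heights, is just the
# fractional line position accumulated at the moment the field begins.
lines_interlaced = [262.5, 262.5]   # NTSC: 525 lines split across two fields
lines_progressive = [262.0, 262.0]  # NES-style: identical fields

def field_offsets(field_lengths):
    """Offset of each field relative to the first, in scanline heights."""
    t = 0.0
    offsets = []
    for length in field_lengths:
        offsets.append(t % 1.0)   # fractional line position at field start
        t += length
    return offsets

print(field_offsets(lines_interlaced))   # [0.0, 0.5] -> fields interleave
print(field_offsets(lines_progressive))  # [0.0, 0.0] -> fields overlap
```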
There is an odd/even flag in an interlaced composite signal. A flag is just something that can be in one of two states; its representation can be anything, not just what one conventionally thinks of as a flag. This kind of flexible thinking allows application of patterns to more situations and thus greater mental abilities. It's true that the even/odd field differentiation wasn't thought of as a flag back when it was created, since it was basically driving the TV's deflection circuit in a sense, causing it to offset half a scanline height. This doesn't prevent us from seeing it as a flag now when reasoning about the signal in the abstract. For the SNES, which can do progressive or interlaced based on software control, it is little more than a flag. "If in interlaced mode, every other frame we need to output a special section in the video signal to tell the TV that it's the other field type."
blargg wrote:
For the SNES, which can do progressive or interlaced based on software control, it is little more than a flag. "If in interlaced mode, every other frame we need to output a special section in the video signal to tell the TV that it's the other field type."
There is no "special section" in the TV signal. Your flag is only needed by control software to synchronize to the video signal; it's generated by the video processing device. The TV itself doesn't require any flag. An analog TV gets the correct picture naturally; a digital TV just counts scanlines between VSync pulses (and with the additional postprocessing, a digital TV has output lag). The result of that scanline counting is what you can consider your "flag".
@HardwareMan: your diagrams cleared up some of my misunderstandings of NTSC interlaced video. Maybe the literature I was referring to was oversimplified and never noted that scanlines aren't exactly horizontal. Thanks.
The deflection coils are usually tilted a bit so the lines do appear horizontal to you on the screen...
TmEE wrote:
The deflection coils are usually tilted a bit so the lines do appear horizontal to you on the screen...
Indeed. Also, there is a special block in CRT TVs (especially big ones): the geometric distortion correction unit. The most common distortions are trapezoid, parallelogram, barrel, and pincushion:
These happen for different reasons, for example the varying distance from the beam source to the screen. So, to correct the visible geometric distortion, some nonlinearity is injected into the perfectly linear signal (usually a weak crosstalk between the horizontal and vertical deflection signals). So the vertical signal looks like this:
And the horizontal like this:
Of course, this is only needed for CRT TVs, and especially those with flat screens (they also need dynamic focus because the difference in distance is very big).
HardWareMan wrote:
blargg wrote:
For the SNES, which can do progressive or interlaced based on software control, it is little more than a flag. "If in interlaced mode, every other frame we need to output a special section in the video signal to tell the TV that it's the other field type."
There is no "special section" in the TV signal. Your flag is only needed by control software to synchronize to the video signal; it's generated by the video processing device. The TV itself doesn't require any flag. An analog TV gets the correct picture naturally; a digital TV just counts scanlines between VSync pulses (and with the additional postprocessing, a digital TV has output lag). The result of that scanline counting is what you can consider your "flag".
You're stuck in concrete thinking. I don't think you can understand what I'm saying.
blargg wrote:
You're stuck in concrete thinking. I don't think you can understand what I'm saying.
I've re-read this whole thread again, especially your posts. Your question is: why do digital TVs draw the NES picture with deinterlacing instead of as low-res progressive at double the frame rate? Am I right? If so, TmEE already gave you the
answer.
This thread wasn't to ask a question; it was to show that the NES picture isn't properly drawn as if it were a plain 240p signal, but rather more like a 480p picture where every other scanline is black. As I elaborated earlier, this is because it's using a single field that's meant to be interlaced with another whose scanlines are vertically offset by half a line.
But that offset is not happening, there is none of that in the video signal that leaves the NES...
Again, to make sure I understand what is said so far:
An Atari 2600, Odyssey 2, Intellivision, ColecoVision, Apple II, Commodore 64, NES, SMS, and the vast majority of Sega Genesis, Super NES, PlayStation, and N64 games all produce a picture that can be described as 240p. But a monitor focused for 480i displaying a 240p picture will have gaps between the scanlines that are as prominent as those of a monitor focused for 480p displaying a 480p picture where every second line is black. These scanline gaps are essential to the correct look of fifth-generation and older consoles.
blargg: Please confirm or deny that I understand what you're saying.
Yes, tepples, that summarizes it.
blargg wrote:
This thread wasn't to ask a question; it was to show that the NES picture isn't properly drawn as if it were a plain 240p signal, but rather more like a 480p picture where every other scanline is black. As I elaborated earlier, this is because it's using a single field that's meant to be interlaced with another whose scanlines are vertically offset by half a line.
The trick is that the kinescope beam draws only 240 visible scanlines for NTSC (288 for PAL). And the kinescope shadow mask (with its phosphor screen) has its own resolution, which is much finer, enough to fit the full 480i/576i resolution. Yes, each scanline lights several phosphors in height, even with perfect focus. That's true for a color TV; a B&W TV has no concept of "resolution" at the phosphor screen. Of course all of this applies to CRT TVs.
Next, you say literally: "more like a 480p picture where every other scanline was black". Maybe it looks like that, but it is not. It is a 240p picture with gaps between the scanlines. These gaps form because the vertical deflection does not shift between subframes. To force it to, add one scanline to the even or odd subframe. But the NES PPU doesn't do that. So: a true 240p picture at 60 FPS (50 for PAL).
Given this information, why do you consider my previous posts almost offtopic? If you don't understand exactly how it works and want to discuss it, it's time to learn it, right? Once you know the basics, the question doesn't arise.
Nothing personal, just finding the truth.
First off, my focus is on what a person sees on the TV; the technicalities of it are irrelevant regarding the point I was making.
My main point is that an SD NTSC TV is basically a 480-scanline display, where half the scanlines are drawn in one field and the other half in the other field. So when the NES displays only half the scanlines every frame and never the other half, it's effectively displaying a 480-scanline image where every other line is black. Now remember, this is about what you see, not what the TV is doing behind the phosphors, or what the NES video signal looks like.
Related to that main point is that the image is roughly double the brightness, since each of its scanlines is being refreshed twice as often as scanlines usually are in an image on that display.
Both of these are central to emulating a NES. The first explains why you can't just treat it like any old 256x240 image, and the second why we have significant trouble with it coming out too dark when we put black between scanlines, and why the TV has little trouble.
Regarding scanline spacing, I was conceding that an interlaced display probably has a slightly fatter beam vertically so that scanlines merge into each other more than they would on a progressive-only display. This weakens my point slightly, because the black space between scanlines is less than it would be on a 480p display with every other line black. The phosphors weren't my consideration, since phosphors can be partially lit anyway (especially on a Trinitron or even the rectangle triad arrangements, as opposed to dots).
I re-read the first post in this thread and I see that it comes off as talking about the NES video signal. Sorry for the lack of clarity. I was taking "240p" to mean largely the image that appears on screen, rather than what I take it to mean now: simply the unseen encoding, not how it's presented.
To simulate the scanline gaps without losing brightness, we have to simulate the beam spreading. Here's the title screen of SMB1 with a very simplistic model of beam spreading. Click the attached image below.
blargg wrote:
I was taking "240p" to mean largely the image that appears on screen, rather than what I take it to mean now: simply the unseen encoding, not how it's presented.
OK. You want to get 480p @60 FPS (@50 FPS for PAL)? Because the NES can update every subframe independently. I see. For scanline rendering you can look at Pete, who did the ePSXe plugins. He made the gap scanline brightness adjustable (as a percentage of the picture scanline: 100% means copy the scanline, 0% means black). At 50% it looks nice.
*update*
Actually, the CRT beam doesn't have sharp edges. It has a round shape (with ideal focusing), so it produces a scanline with a vertical luminance gradient. So, to copy this effect you must apply a luminance mask (maximum at the center and minimum at the edges of the scanline), and move the scanlines a little bit closer to each other.
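A minimal sketch of that luminance mask idea, in my own construction (the Gaussian shape, sub-line count, and `sigma` value are illustrative assumptions, not from the post):

```python
import math

# Render each scanline as several sub-lines with a Gaussian-ish vertical
# luminance profile: maximum at the center, falling off toward the edges,
# instead of a hard-edged bar.
def beam_profile(subline, sublines_per_scanline=4, sigma=0.35):
    """Relative luminance (0..1) of one sub-line within a scanline."""
    pos = (subline + 0.5) / sublines_per_scanline  # center of this sub-line
    return math.exp(-((pos - 0.5) ** 2) / (2 * sigma ** 2))

weights = [beam_profile(i) for i in range(4)]
print([round(w, 2) for w in weights])  # symmetric, peaked at the center
```

Multiplying each output row of an upscaled frame by the corresponding weight gives the soft-edged scanline; a larger `sigma` simulates a wider, less focused beam.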
You need quite a large resolution to simulate this, and variable line height to simulate different kinescope sizes... The beam is round, as HardWareMan said, and it lights a bigger area as the brightness increases. On a small screen (21" or less) the whites fill up the blank gap entirely and also smudge into neighboring lines. On larger screens the biggest difference is that you'll have a mostly visible gap regardless of brightness, and the gap is very big at lower brightness.
When you have a white line with a grey-to-black gradient at its end, it will look pointy, not blunt. A grey line with white dots will look like it has bumps on it.
I used my TV's vertical stretch/compress feature to "simulate" different screen sizes.
http://www.tmeeco.eu/BitShit/TopGun0.jpg
http://www.tmeeco.eu/BitShit/TopGun1.jpg
http://www.tmeeco.eu/BitShit/TopGun2.jpg
Here is the same stuff but at a lot lower brightness/contrast. You can see how the lines are all more defined.
http://www.tmeeco.eu/BitShit/TopGun3.jpg
http://www.tmeeco.eu/BitShit/TopGun4.jpg
http://www.tmeeco.eu/BitShit/TopGun5.jpg
blargg wrote:
Regarding scanline spacing, I was conceding that an interlaced display probably has a slightly fatter beam vertically so that scanlines merge into each other more than they would on a progressive-only display. This weakens my point slightly, because the black space between scanlines is less than it would be on a 480p display with every other line black. The phosphors weren't my consideration, since phosphors can be partially lit anyway (especially on a Trinitron or even the rectangle triad arrangements, as opposed to dots).
Certainly it'd be intriguing to see old console video drawn and output nearer to its native form.
In that sense I'm somewhat disappointed about how we lack decent quality 240p progressive-only displays.
By any stretch, far better is possible than Sega Nomad! (to be fair as a non-technical user I don't know what approach it uses for Sonic 2 in 2-player mode)
theclaw wrote:
By any stretch, far better is possible than Sega Nomad! (to be fair as a non-technical user I don't know what approach it uses for Sonic 2 in 2-player mode)
Sonic 2 in 2-player mode uses the VDP interlaced mode, which is documented.
Then what does Nomad do with VDP interlaced mode?
The VDP is vanilla MD stuff, no modifications.
What happens is that nothing happens: the LCD controller that takes raw analog RGB + Csync does not support interlacing, and the new field goes on top of the old like in non-interlaced scan. The result would be flickery, but since the stock LCD is so slow, all the stuff blends together.
Speaking of side-by-side, how about a NES shown on a 13" TV versus a 20" versus a 27", with the pictures of each scaled so the tubes look the same size. That'd show how the effective scanline spacing increases the larger the TV (I know that it's somewhat unpleasant on my 27").
blargg wrote:
Speaking of side-by-side, how about a NES shown on a 13" TV versus a 20" versus a 27", with the pictures of each scaled so the tubes look the same size. That'd show how the effective scanline spacing increases the larger the TV (I know that it's somewhat unpleasant on my 27").
This can still depend on the television; My 27" CRT has the beam focused such that the scanlines for 240p content are not very strong.
I don't have any small TVs, all are 28" or bigger...
I just remembered that there are lots of CRT TVs in the house I'm in. Three 13", three 27", and a 32". Too bad I gave away a 20" recently. I'll have to connect a NES to them and take some photos.
Blargg, if it makes any difference, I understood what you were talking about; I just didn't have much to say.
This is actually interesting to think about; 2x bright means the phosphors are being stimulated a lot more than by a regular television signal, which is why the colors of an NES look so distinct and vibrant on a CRT. But if you were somehow able to feed the NES's video signal into something that manually inserts its own interlacing v-sync, I'll bet the colors would change slightly, even if it's displaying the same picture.
I just realized that this 2x brightness is a big reason they had that warning about playing it on projection TVs, since those internally have CRTs which already run at a really high brightness compared to normal CRTs. It wasn't just that games show static images (score, or entire screen for puzzle games), it's that they were drawing each scanline twice as many times per second as an interlaced signal.
blargg wrote:
I just realized that this 2x brightness is a big reason they had that warning about playing it on projection TVs, since those internally have CRTs which already run at a really high brightness compared to normal CRTs. It wasn't just that games show static images (score, or entire screen for puzzle games), it's that they were drawing each scanline twice as many times per second as an interlaced signal.
If you would like me to hook up my oscilloscope I can do so... this seems heavily founded in speculation. The scanline is not being drawn twice as many times per second as an interlaced signal. It is drawn roughly the same number of times as on a SNES, Mega Drive, PC-Engine...
TmEE wrote:
I don't have any small TVs, all are 28" or bigger...
It's monochrome, but I have a small 12" Zenith monitor that takes a Y input (signal + sync); would it be useful to take pictures of it showing anything?
Monochrome does not count due to design, but I came rainbow seeing your pic.
Going over the same area of the phosphor does not really make it brighter in a non-interlaced scan. The maximum intensity remains the same. By the time the next field comes, the phosphors are dark; it takes less than a quarter of the screen for the phosphors to dim down completely. It all comes down to persistence of vision.
The light curve the phosphors exhibit is an exponential decay: a very short peak and a rapid decrease in brightness. This is the reason the image is not perceived as flickery when the brightness is turned down on a CRT: you stay more in the gradual part of the slope.
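That decay description can be sketched as a toy model (assuming a simple exponential decay; the 1 ms time constant is made up for illustration, not a measured value):

```python
import math

# Toy model of phosphor persistence: a short bright peak followed by a
# rapid exponential decay. The 1 ms time constant is illustrative only;
# real phosphors differ per color and per tube.
def phosphor_brightness(t_ms, tau_ms=1.0):
    return math.exp(-t_ms / tau_ms)

# One NTSC field lasts ~16.7 ms; a quarter of a field is ~4.2 ms.
for t in (0.0, 1.0, 4.2, 16.7):
    print(f"{t:5.1f} ms -> {phosphor_brightness(t):.6f}")
```

With this time constant, the phosphor is effectively dark a quarter of the way into the field, matching the claim above that persistence of vision, not phosphor glow, carries the image between refreshes.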
Maximum brightness, yes, but the perceived brightness should be higher because the RMS brightness is twice as much.
The absence of flicker has as much to do with ambient lighting levels as with the shape of the discharge: a very dim CRT (if you could see it at all) in a very bright room will flicker perceptibly.
I also don't know whether the phosphors used in old CRT-based projectors are different (and so have a different half-life) than those in direct-view sets; sets made before P22's discovery necessarily did.
Quote:
blargg wrote:
I just realized that this 2x brightness is a big reason they had that warning about playing it on projection TVs, since those internally have CRTs which already run at a really high brightness compared to normal CRTs. It wasn't just that games show static images (score, or entire screen for puzzle games), it's that they were drawing each scanline twice as many times per second as an interlaced signal.
If you would like me to hook up my oscilloscope I can do so... this seems heavily founded in speculation. The scanline is not being drawn twice as many times per second as an interlaced signal. It is drawn roughly the same number of times as on a SNES, Mega Drive, PC-Engine...
I'm baffled by how hard this is to communicate. Here are two frames; the left half is progressive, the right half interlaced. The scanlines are numbered.
Code:
     Frame 1:        Frame 2:
     prog    int     prog    int
1    ****    ****    ****
2                            ****
3    ****    ****    ****
4                            ****
In progressive, scanline 1 is drawn twice, once in each frame. In interlace, scanline 1 is only drawn once during these two frames. Progressive draws the scanline twice as many times per second. Hence, more (twice?) burn-in per second than an interlaced static image, and only on every other scanline, leading to irregular aging.
TmEE wrote:
Going over same area on the phosphor does not really make it brighter in non-interlaced scan.
Please explain this picture (2x_bright.nes):
Attachment:
compare.JPG [ 13.29 KiB | Viewed 1794 times ]
In it, the NES is illuminating every other line. On odd frames, every other 8-pixel column is shifted down one pixel. So in effect, you get a magnified side-by-side comparison between progressive and interlace. The appearance is of the stable lines being twice as bright as the ones illuminated only once every other frame. At a distance, the whole screen looks like a solid shade, with no vertical columns visible. Since on average only half as many lines compose every other column, those lines must be twice as bright, or else the column wouldn't appear the same shade as the interlaced one.
Code:
     Even frames         Odd frames
1    ****************    ********
2                                ********
3    ****************    ********
4                                ********

     Appearance
1    ********--------
2            --------
3    ********--------
4            --------
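The test ROM's effect can also be sketched numerically (a toy model with arbitrary units; the frame dictionaries and line values are illustrative, not measured):

```python
# Toy model of the 2x_bright.nes test, averaged over two frames.
# Column A ("progressive"): the same lines are lit in every frame.
# Column B ("interlaced"): the lit lines shift down by one on odd frames.
# Each dict maps line number -> emitted light per frame (arbitrary units).
frames_a = [{1: 1.0, 3: 1.0}, {1: 1.0, 3: 1.0}]
frames_b = [{1: 1.0, 3: 1.0}, {2: 1.0, 4: 1.0}]

def column_average(frames, num_lines=4):
    # Time-averaged shade of the whole column (what you see at a distance).
    total = sum(sum(f.values()) for f in frames)
    return total / (len(frames) * num_lines)

def line_average(frames, line):
    # Time-averaged brightness of a single scanline.
    return sum(f.get(line, 0.0) for f in frames) / len(frames)

print(column_average(frames_a))   # 0.5
print(column_average(frames_b))   # 0.5 -- same shade at a distance
print(line_average(frames_a, 1))  # 1.0 -- the steady line
print(line_average(frames_b, 1))  # 0.5 -- lit only every other frame
```

Both columns average to the same shade, yet column A concentrates that light on half as many lines, so each lit line carries twice the time-averaged emission: exactly the 2x claim the ROM demonstrates.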
I only see this effect here when I turn the brightness down considerably.
I would say it varies from TV to TV. The only other TV I have here is an LG 100Hz wonder that deinterlaces any input... digitally... with a rather blurry result and input lag, making games hard to play.
EDIT: I tried it in 50Hz, and I see the effect at near-normal brightness too; I still had to lower it a bit. It is said that TVs in the EU have longer glow on the phosphors, and that is probably what is happening over here and what diminishes the effect (and why I said that it should not really make the image brighter...).
Here's a quick test:
I set my camera (Canon non-ELPH S100, ISO 80, F/8.0, 1/30s) on a "tripod" (chair), started a game using FCEUGX (which supports toggling between 240p and 480i), and cropped out a portion of both. 1/30s exposure was chosen to combine two fields.
All the differences on screen, including the brightness difference, are visible in the green and blue phosphors. The red phosphor used in CRTs is known to be very different ( http://en.wikipedia.org/wiki/File:CRT_phosphors.png ); I have no other explanation.
blargg wrote:
Please explain this picture (2x_bright.nes):
Optical illusion?
If you ask how it can be captured by the camera, I can't explain it, but I have seen various strange things; it depends on how the camera's sensor is made.
Just one thought: did you measure the brightness with any device?
Whatever floats your boat.
Just for kicks, let me see if I can reword Blargg's original explanation so it's easier to grasp.
A CRT TV expects a 480i signal, so it has enough room to draw 480 scanlines (ignore overscan for right now). The way you draw a 480i picture is by telling the TV you're sending the odd scanlines (and then you send them), and then telling the TV you're sending the even scanlines (and then you send them). This means that every vblank, you alternate between drawing the odd scanlines and the even scanlines. Yes, that means you can only draw a full 480-line picture at 30 frames per second, not 60.
Instead of alternating between drawing the odd scanlines and drawing the even scanlines, the NES creates a "240p" resolution by repeatedly telling the TV that it's sending the odd scanlines. So, in place of drawing even scanlines, it draws on the odd scanlines again. That means, the odd scanlines are drawn at 60 frames per second, instead of the usual 30 frames per second. Because of that, the phosphors get drawn on twice as much as they're supposed to. That doesn't necessarily mean they are twice as bright, but they're most likely brighter than 1x.
The TV doesn't know that this is happening, because it only does what it's told. The SIGNAL tells it to draw the even or the odd scanlines. If the signal is only telling it to draw odd scanlines, then that's what happens. The odd scanlines get updated at twice their usual rate, while the even scanlines stay blank. There's no "command" you can send to the TV to tell it to switch into some kind of 240p-mode, so the scanlines are always skinny enough to allow 480 of them on the screen, even if the signal is only using half of them. That's why you see those familiar black lines on the screen.
So in summary: The TV always expects to be drawing 480 scanlines on the screen. To draw 480 scanlines, you alternate between odd and even scanlines each vblank, which creates a refresh rate of 30 fps. Instead of drawing even scanlines, the NES draws OVER the odd scanlines again, creating a refresh rate of 60 fps, but a vertical resolution of 240. Since the odd scanlines are being drawn twice as often as usual, they're brighter (not necessarily 2x, but most likely brighter than 1x). Since the TV is still expecting to draw 480 scanlines, the odd scanlines are skinny enough for the even scanlines to fit between them, even though the even scanlines never get drawn (they stay blank).
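Drag's summary boils down to a per-scanline refresh count, which can be sketched like this (the function and its names are mine; the 60 Hz figure is NTSC's nominal field rate):

```python
FIELD_RATE = 60  # nominal NTSC fields per second

def draws_per_second(mode, line_parity):
    """How often a given scanline is refreshed per second.

    '480i' alternates odd and even fields, so each line is hit
    every other field; '240p' repeats the same field every time,
    so one parity is hit every field and the other never.
    """
    if mode == '480i':
        return FIELD_RATE // 2
    if mode == '240p':
        return FIELD_RATE if line_parity == 'odd' else 0
    raise ValueError(mode)

print(draws_per_second('480i', 'odd'))   # 30
print(draws_per_second('240p', 'odd'))   # 60 -- twice as often
print(draws_per_second('240p', 'even'))  # 0  -- stays black
```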
Drag wrote:
That doesn't necessarily mean they are twice as bright, but they're most likely brighter than 1x.
To look at it another way: the *total* time spent drawing is the same, regardless of whether proper 480i or "240p" input is given. So the *total* brightness is the same. But because every scanline is either drawn twice or never at all, the illuminated scanlines will emit twice as many photons, and in human logarithmic perception, a single scanline's brightness will be seen as somewhere between 1x and 2x brighter.
The left one would be an "interlaced" line; the rest would be "non-interlaced".
Absolute brightness won't increase; the average over some period of time will. To increase absolute brightness you would have to increase the beam intensity, and that is not going to happen without messing with the TV settings or the video signal strength.
OK, your idea is very clear: a PWM effect. And to be honest, the area of glowing phosphor is different in 240p and 480i modes.
Drag, thanks for your rewording, especially the part about the TV having no "command" for progressive and thus leaving room for the other field even though it never comes.
The test ROM I posted shows that they are twice as bright, because at a distance the TV looks like a solid shade, even though the 8-pixel-wide columns alternate between being drawn progressive and essentially interlaced (magnified).
I'm not sure why technical definitions of brightness (peak) are being brought in, since they are not relevant to the appearance to a human (or to a camera with a 1/30 sec or longer shutter). If we talk about what's technically happening, there is a big burst of immense brightness, then near black for a relatively long time. If we zoom in even more, each phosphor molecule is probably outputting TOTAL BLACK, then suddenly a single photon, then total black, then another photon, etc., such that there is only a binary of brightness: either nothing or a measly photon's worth (OK, the wavelength of the photon determines its energy, so the red ones carry less than the blue ones AFAIK). So clearly it's impossible for there ever to be anything brighter than a photon, unless you look at the aggregate over time... which is exactly what's being done when you look at it in, say, 1-msec segments and see a rising and then falling profile of illumination, and when I look at it in 1/30-sec segments of time and see a higher average illumination for the progressive lines. It's just a question of the arbitrary length of time over which you count the photons emitted.
There is no correct level to look at what's "really" happening. Looking at it at the millisecond, aggregate level is no more correct than the atomic level (or subatomic), or the human perception/camera level. There is a level that's relevant to understanding and recreating what humans see when playing a NES on a TV (and why), and that's the level I was discussing things on.
From a purely photometric standpoint, each scanline of 240p (called "double strike" mode in purported Nintendo documents) probably emits exactly twice as much light power as each scanline of 480i. The reason it appears less than twice as bright is that, as programmers, we're used to "twice as bright" meaning "twice as much voltage". But twice as much voltage produces roughly four times as much power, as power is proportional to the square of voltage for a given impedance. (It's not so simple in CRTs, as the gamma is slightly greater than 2.0 for various reasons.) Under this power-law assumption that perceived brightness is the square root of light power, the individual lines probably appear closer to 1.4 times as bright.
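The arithmetic behind those figures, under the assumptions just stated (square-law power, gamma slightly above 2.0):

```python
# Doubling the drive voltage raises light power by roughly the square
# (a real CRT's gamma is slightly above 2.0; 2.2 is a common figure).
gamma = 2.2
power_from_double_voltage = 2.0 ** gamma   # ~4.59x light power

# Double-struck scanlines emit 2x the light power; under the
# square-root assumption for perceived brightness:
power_ratio = 2.0
perceived = power_ratio ** 0.5
print(round(power_from_double_voltage, 2))  # 4.59
print(round(perceived, 2))                  # 1.41
```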
(I just got Rickrolled by WLDE.)
OK, for all you people: I've found an interesting clip that shows the basics of the CRT TV raster:
http://www.youtube.com/watch?v=lRidfW_l4vs
Quote:
This is a popular request for a TV scanning. shot at 10,000 FPS and played back at 24 FPS the TV refreashes @ 29.97 FPS. GOOD NEWS NOW YOU CAN DOWNLOAD MOST OF YOUR FAVORATE ULTRASLO CLIPS ON YOUTUBE. This will help us produce more clips for your viewing pleasure.
Notice: the red fades much more slowly than the green and blue.
As you can see, the phosphors are extinguished much more quickly than most of you think. The first picture tubes had long afterglow, which slightly smoothed the flicker; their phosphor afterglow extended almost halfway down the screen. Short afterglow is needed for high-refresh-rate TVs (or CRT monitors).
Conclusion: consider the phosphor as a capacitor that is charged by a short, strong pulse of energy. 240p refreshes twice as fast, so this capacitor doesn't have much time to discharge fully. So, with the same charge energy, there is more energy to fill the time gaps between charge pulses, which may look like slightly higher luminance. This is typical and correlates with the diagrams that TmEE provided. A PWM effect.
Ah, one more thing. From my electronics past: in a fresh tube, the main anode current is almost constant and limited to about 1 mA. So the phosphor charge energy per pulse is definitely almost constant.
This thread is comical, like two people speaking different languages talking past each other.
Sorry to dig up old bones, but I had a discussion with a friend of mine about this.
Does anyone know what part of the NTSC signal determines which "field" you draw? (Most of you are calling them scanlines.)
I tried to find info in the NTSC spec about fields, but came up short.
I was considering looking through some of the open source AVR projects that generate a 240p signal in software and seeing what they're doing in the code...
EDIT: I think I may have found the answer:
https://sites.google.com/site/h2obsessi ... /Interlace
Which led me to this:
https://docs.google.com/a/54.org/viewer ... V_Sync.pdf
Is it that simple?
Interlacing is the result of the hsync frequency not being an integer multiple of the vsync frequency. It's also the reason for the equalizing pulses during the vertical blanking interval.
It just depends on the timing of the vblank.
To draw an interlaced picture, you draw 262.5 scanlines, the end (or beginning) of which is vblank. That half scanline is what creates the interlacing: vblank alternates between falling at the beginning of a scanline and in the middle of one, which causes each new field to alternate between starting at the top-left of the screen and the top-center.
This is important because scanlines actually slant downward slightly. This is because the electron gun in the television is always moving down, regardless of whether you're in hblank or not; otherwise you'd never reach the bottom of the screen. When you start drawing at the top-center of the screen, the resulting half-scanline is above the scanline you would've been drawing had you started at the top-left. Each subsequent scanline is offset in the same way, until the next vblank.
To draw a non-interlaced picture (240p in other words), you have to either drop the half scanline, or extend it into a full scanline. In other words, always draw 262 or always draw 263 scanlines (again, the end (or beginning) of which is vblank).
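The resulting field rates fall out of the relation quoted earlier in the thread (hsync speed = lines x fields), e.g. with the standard NTSC line rate (a quick sketch; numbers rounded):

```python
H_RATE = 15734.264  # standard NTSC horizontal line rate, Hz (approx.)

def field_rate(lines_per_field):
    # hsync speed = lines * fields, so fields = hsync speed / lines
    return H_RATE / lines_per_field

print(round(field_rate(262.5), 2))  # 59.94 -- interlaced (half line kept)
print(round(field_rate(262), 2))    # 60.05 -- progressive, half line dropped
print(round(field_rate(263), 2))    # 59.83 -- progressive, half line extended
```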
Drag wrote:
To draw a non-interlaced picture (240p in other words), you have to either drop the half scanline, or extend it into a full scanline. In other words, always draw 262 or always draw 263 scanlines (again, the end (or beginning) of which is vblank).
So which method does the NES use?
The NES draws exactly 262 scanlines, which is why its vertical refresh frequency is about 60.1 Hz instead of 59.94 Hz.
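That figure can be derived from the NES's own clock timing (a sketch using the commonly cited NTSC NES constants; it ignores the single dot the PPU skips on alternate rendered frames, which nudges the average rate very slightly):

```python
MASTER_HZ = 21_477_272.7   # NTSC NES master clock, Hz (approx.)
DOT_HZ = MASTER_HZ / 4     # PPU pixel clock: master clock divided by 4
DOTS_PER_LINE = 341        # PPU dots per scanline
LINES_PER_FRAME = 262      # no half line: always 262 full scanlines

frame_hz = DOT_HZ / (DOTS_PER_LINE * LINES_PER_FRAME)
print(round(frame_hz, 4))  # ~60.0985 -- the "60.1 Hz" quoted above
```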
In any case, the so-called "240p" signal has timing compatible with the actual 240p RGB signals used by Apple IIGS RGB monitors and standard-resolution arcade monitors.
Yeah, it's actually the monitor's responsibility to deal with the sync signals given by the source. That's why they're there.
As long as there isn't a significant amount of deviation from the standard, any monitor will be able to display the picture. That's not to say there aren't monitors that will handle a super-wrong signal, though.
Given the extent to which the output of a VCR does NOT comply with NTSC, you can be reasonably assured that ALL SDTV CRTs can deal with 240p and 480i properly. (More information, and I found this a very interesting read:
http://www.ronaldsnoeck.com/vcr.htm )
HDTVs, with their "motion interpolation" and "upscan conversion", try to put lipstick on the pig... and that only comes close to working when the pig complies.