I have compiled Blargg's NTSC filter demo for Windows. This may be useful for previewing how an image would look on an NTSC screen.
It's a command-line tool. It works on test.bmp by default, but you can pass a filename as an argument, or just drag a BMP onto the EXE.
Original source code by Blargg, available here:
http://slack.net/~ant/libs/ntsc.html
Very cool indeed.
Why is the colour depth reduced? Do you force the use of the NES palette? But then what if we want to test graphics for another platform?
There's a different library for each system, and each of them takes the color depth native to that system (there's also the fact that each system has its own set of artifacts).
Thanks. I was going to compile it myself, but couldn't be bothered to install SDL.
Pretty cool, I'll have a lot of fun with this.
What would be great is to have this as an option in a tile editor such as Tile Molester or YY-CHR, so that you can preview your BGs or sprites directly with the NTSC filter as you draw them.
Ah, I didn't realize the nes_ntsc one reduced colour like that. I have rebuilt with the snes_ntsc version.
Awesome! Thanks for compiling this to make things easier on us artsy people. I'll definitely be giving this a shot with a bunch of my NES assets once I get home later. I'll show you my results.
Bregalad wrote:
What would be great is to have this as an option in a tile editor such as Tile Molester or YY-CHR, so that you can preview your BGs or sprites directly with the NTSC filter as you draw them.
Run my graphics editor in Nestopia with the NTSC filter turned on.
Mmh, sorry, yes I know your editor is impressive and all, but honestly running it like that is so much less user-friendly than a PC program that allows the mouse to be used. Again, sorry, but I'd sooner use YY-CHR without the NTSC filter than your editor with the NTSC filter.
Here's an image for your first post using the map from my M25 project! I decided to choose something detailed to demonstrate that the NTSC filter isn't too destructive of details. On the left is the unaltered image, the center is FCEUX's filter at 3x zoom, and the right image is Blargg's filter.
Threads are always better with images.
Why'd you squish the NTSC output vertically instead of leaving it at its original increased width? The scaling you've applied seems to completely overshadow the NTSC filter.
Well, the output of the NTSC filter was at a completely different aspect ratio than the other two versions so there wasn't much I could do to help that. I mean, I could have stretched it to force a 0.875 aspect ratio but that would have looked even sillier.
Why not leave it in its original aspect ratio?
If I take this 256x224 image (0.875 aspect ratio)...
Then I apply Blargg's filter without touching the output at all, and I get this 602x448 image (0.744186 aspect ratio).
That's impossible to cleanly resize back down to 256x224 unless I want to stretch the pixels in the process.
Quoting the documentation (nes_ntsc.txt) that comes with blargg's nes_ntsc code:
Code:
Image Size
----------
For proper aspect ratio, the image generated by the library must be
doubled vertically.
Use the NES_NTSC_OUT_WIDTH() and NES_NTSC_IN_WIDTH() macros to convert
between input and output widths that the blitter uses. For example, if
you are blitting an image 256 pixels wide, use NES_NTSC_OUT_WIDTH( 256 )
to find out how many output pixels are written per row. Another example,
use NES_NTSC_IN_WIDTH( 640 ) to find how many input pixels will fit
within 640 output pixels. The blitter rounds the input width down in
some cases, so the requested width might not be possible. Use
NES_NTSC_IN_WIDTH( NES_NTSC_OUT_WIDTH( in_width ) ) to find what a given
in_width would be rounded down to.
And from nes_ntsc.h -- it looks like these are macros, so someone would need to write the code to dynamically change this on the fly (through a command-line argument or some other means). Comments are worth noting:
Code:
/* Number of output pixels written by blitter for given input width. Width might
be rounded down slightly; use NES_NTSC_IN_WIDTH() on result to find rounded
value. Guaranteed not to round 256 down at all. */
#define NES_NTSC_OUT_WIDTH( in_width ) \
((((in_width) - 1) / nes_ntsc_in_chunk + 1) * nes_ntsc_out_chunk)
/* Number of input pixels that will fit within given output width. Might be
rounded down slightly; use NES_NTSC_OUT_WIDTH() on result to find rounded
value. */
#define NES_NTSC_IN_WIDTH( out_width ) \
(((out_width) / nes_ntsc_out_chunk - 1) * nes_ntsc_in_chunk + 1)
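To make those comments concrete, here is a minimal sketch of my own (not part of the demo) that assumes nes_ntsc.h is on the include path and that the chunk constants keep their documented values (nes_ntsc_in_chunk = 3, nes_ntsc_out_chunk = 7); it reproduces the 256 -> 602 width seen in the example image above:
Code:
/* Sketch only: print how the blitter's width macros round a requested width.
   Assumes blargg's nes_ntsc.h is available to include. */
#include <stdio.h>
#include "nes_ntsc.h"

int main( void )
{
	int in_width  = 256;
	int out_width = NES_NTSC_OUT_WIDTH( in_width );  /* 602 for an input of 256 */
	int rounded   = NES_NTSC_IN_WIDTH( out_width );  /* 256: guaranteed not rounded down */

	printf( "input %d -> output %d (input rounded to %d)\n", in_width, out_width, rounded );

	/* For proper aspect ratio the output rows must also be doubled vertically,
	   e.g. a 256x224 input becomes a 602x448 image, as in the example above. */
	return 0;
}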
Bregalad wrote:
Mmh, sorry, yes I know your editor is impressive and all, but honestly running it like that is so much less user-friendly than a PC program that allows the mouse to be used.
I could make a version that uses a mouse. I've already made two programs that support a Super NES mouse through a Super NES controller to NES adapter: Thwaite and my sound effects editor. But right now, the only NES emulator I can think of that supports a mouse is Nintendulator, and Quietust refuses to add filters. So first you'd have to either A. solder together such an adapter to run on an actual NES, B. request support for an NTSC filter in an NES emulator that supports the Super NES Mouse, or C. request support for the Super NES Mouse in an NES emulator that supports an NTSC filter. I've requested support for the Super NES Mouse in FCEUX.
DragonDePlatino wrote:
That's impossible to cleanly resize back down to 256x224 unless I want to stretch the pixels in the process.
What is wrong with 301 x 224? Even changing the aspect ratio with interpolation back to 256 x 224 would still show the NTSC filter's effect pretty well.
What you did was resize it to 256 x 224 while preserving the new aspect ratio with black bars, and apparently without interpolation, leaving us with an image that is very distorted vertically (note especially all those stripe artifacts across the image). All I'm saying is that because of the resizing process you applied, the example you made didn't really show us what the NTSC filter does very well at all.
Three examples of how it could have been done alternatively:
Attachment: just_interpolated.png — Interpolated to 301 x 224. (Aspect preserved.)
Interpolated to 301 x 224. I think this is best; an interpolation of precisely 50% on both axes leaves a clean result, and preserves the aspect ratio.
Attachment: letterboxed.png — Interpolated to 256 x 191. (Aspect preserved, vertical information loss.)
Interpolated to 256 x 191. Unlike the example you made, the interpolation method used here preserves some of the detail. There is still significant information loss vertically, so this is not entirely clean, but at least you have fewer artifacts.
Attachment: horizontally_interpolated.png — Interpolated to 256 x 224. (Horizontal squish.)
Interpolated to 256 x 224. This shows fairly well what the NTSC filter does, but ignores the aspect ratio change. There is some loss of detail about what the NTSC filter does horizontally, but it is not as prone to aliasing problems as the vertical loss of detail is, since the NTSC signal is organized in scanlines.
Specifically, Lanczos interpolation was used, but other interpolation methods may give similar results. The main thing here is not to use nearest-neighbour resampling (no interpolation) when scaling down.
Oh! I almost forgot! Yeah, I was using nearest-neighbor when scaling down... I tend to have that set as my default scaling since I draw with programs like Aseprite. Anyways, I think that last image will work just fine for the thread!
Fair warning: the reason nes_ntsc produces an image that's stretched horizontally is that the actual hardware does that. Undoing the aspect ratio change is somewhat misleading ... either you have the NTSC artifacts and the pixel aspect ratio is 8:7, or you use square pixels and there are no NTSC artifacts.
Not to mention: I don't know what all of this is being used for -- the thread so far implies it's being used for map generation -- but when it comes to screenshots, it's my experience that people really don't like seeing blurry crap and would rather see sharp pixels.
I can't tell you how many times I've used GameFAQs maps and so on, only to find people applied stupid filters + weird resizing + other crap and the end result actually hurts my eyes. I actually go find other maps/walkthroughs/etc. in that scenario. So IMO you're better off not bothering and instead just doing a linear ("nearest neighbour") scale of 2x and so on.
If you're doing it for some other reason, i.e. you wanna see if the graphics improvements/changes you're doing would look semi-good with actual NTSC artefacting, then that's a bit more understandable.
Quite a few emus, such as VirtuaNES, have an option for an 8:7 ratio (a checkbox called "TV aspect"), and have had it since long before Blargg made the NTSC filter.
Quote:
linear ("nearest neighbour")
Linear and nearest neighbour are two completely different things! From the context it sounds like it's nearest-neighbour interpolation you're talking about, NOT linear, since linear interpolation WILL blur the pixels together.
Well, the reason I was originally interested in this is because I needed to test my sprites for clarity. In the past, I've had a lot of people complain that my sprites don't have enough contrast or they're too detailed even if I think they're fine. So as a good benchmark, I like to view them with an NTSC filter to see if I can still recognize them. If I can't, I usually go back and change body proportions, remove details, etc.
Anyways, regardless of the aspect ratio, this NTSC filter works great for my purposes! Thanks again for compiling this for the less programming-adept such as myself.
I'd say that the 602×448 image probably shouldn't have been downscaled at all in the first place... I mean, artifacts vary at the subpixel level, so the higher resolution is actually needed for an accurate representation. Not to mention that let's face it, it's not a large image for today's resolutions (heck, for the current higher resolutions 256×224 is actually too small).
Bregalad wrote:
Quote:
linear ("nearest neighbour")
Linear and nearest neighbour are two completely different things! From the context it sounds like it's nearest-neighbour interpolation you're talking about, NOT linear, since linear interpolation WILL blur the pixels together.
I guess I'm using the wrong terms then, sorry. I know bilinear and trilinear interpolation will result in horrific blur, but was under the impression the term "linear interpolation" (no bi- or tri- prefixes) meant the same thing as nearest-neighbour. Good to know I was wrong + learned something.
koitsu wrote:
I guess I'm using the wrong terms then, sorry. I know bilinear and trilinear interpolation will result in horrific blur, but was under the impression the term "linear interpolation" (no bi- or tri- prefixes) meant the same thing as nearest-neighbour. Good to know I was wrong + learned something.
Bilinear and trilinear are both linear interpolation, just in 2 and 3 dimensions, respectively. For example:
- Linear interpolation is done on audio.
- Bilinear interpolation is done on images.
- Trilinear interpolation is done on a volume, often a stack of images like in mip-mapping.
The prefix may be omitted if the number of dimensions is implicitly known, or otherwise unimportant.
Unfortunately this scheme of prefixes does not hold up consistently in other cases. For instance the terms "biquadratic filter" and "binormal" don't refer to 2-dimensional analogs of a "quadratic filter" or "normal".
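For what it's worth, here's a rough sketch of my own (not taken from any particular library) showing how "bilinear" is just linear interpolation applied once per axis, with trilinear adding one more blend between two such results:
Code:
#include <stdio.h>

/* lerp: 1D linear interpolation between two samples (what's done on audio). */
static float lerp( float a, float b, float t )
{
	return a + (b - a) * t;
}

/* bilerp: linear along X on the top and bottom rows of a 2x2 block, then
   linear along Y between those results (what's done on images). Trilinear
   would add one more lerp between two bilerp results, e.g. two mipmap levels. */
static float bilerp( float p00, float p10, float p01, float p11, float tx, float ty )
{
	float top    = lerp( p00, p10, tx );
	float bottom = lerp( p01, p11, tx );
	return lerp( top, bottom, ty );
}

int main( void )
{
	/* Sample a point 25% across and 50% down a 2x2 block of values 0,1,2,3. */
	printf( "%f\n", bilerp( 0.0f, 1.0f, 2.0f, 3.0f, 0.25f, 0.5f ) );  /* prints 1.25 */
	return 0;
}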
koitsu wrote:
I guess I'm using the wrong terms then, sorry. I know bilinear and trilinear interpolation will result in horrific blur, but was under the impression the term "linear interpolation" (no bi- or tri- prefixes) meant the same thing as nearest-neighbour. Good to know I was wrong + learned something.
Bilinear refers to linear interpolation in both the X and Y axes. Trilinear means doing that and also linear interpolation between mipmaps (a third "axis"), although I suppose it also applies to the Z axis when it comes to 3D textures.
Simple linear interpolation would be interpolating in one axis and not the others =P I'm not even sure if that's supported by the hardware (OpenGL doesn't support it).
If you do bilinear interpolation between two images of the same height, you get linear interpolation in the horizontal direction and nothing in the vertical. Complications may arise if an API is unclear about whether a coordinate of 0 applies to a texel center or a texel corner.
That's not unclear at all:
In OpenGL, 0 is a pixel corner.
In Direct3D <= 9, 0 is a pixel centre (i.e. the wrong way to do it).
In Direct3D >= 10, 0 is configurable as either but defaults to the pixel corner.
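To restate that with just the arithmetic (a hedged sketch of my own; the helper names are made up), the difference between the two conventions amounts to half a pixel:
Code:
/* Corner convention (OpenGL; Direct3D >= 10 default): an integer coordinate x
   lands on a pixel corner, so the centre of pixel x sits at x + 0.5. */
static float pixel_centre_corner_convention( int x )
{
	return (float) x + 0.5f;
}

/* Centre convention (Direct3D <= 9): an integer coordinate x lands on a pixel
   centre, so the centre of pixel x is simply x; converting between the two
   conventions means adding or subtracting half a pixel. */
static float pixel_centre_centre_convention( int x )
{
	return (float) x;
}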
Could you post the compiled nes_ntsc demo as well? I believe it has some benefits over the more generic snes_ntsc library when used on NES images.
Hrm, I seem to have deleted the project. What's the benefit of the NES version? I thought the only difference was that it remaps the image to a particular palette.
Forcing people to not use known-wrong palettes for NES mockups has some merits...
How would you identify a "known wrong" palette, when it isn't even possible to have a definitive "correct" palette?
Anyhow, sorry if I'm being argumentative here; I'd be happy to redo the process and build the NES version sooner or later, but I want to know that there is some value in it before I do (i.e., what is the difference, and why is it useful?).
rainwarrior wrote:
Anyhow, sorry if I'm being argumentative here; I'd be happy to redo the process and build the NES version sooner or later, but I want to know that there is some value in it before I do (i.e., what is the difference, and why is it useful?).
I was hoping you'd have the compiled version somewhere. If you don't, no need to compile it for my sake (I don't have any immediate use for it).
I don't actually know if there are any visible functional differences between the libraries beyond the lack of conversion. I was just assuming so because nes_ntsc is distributed as a separate library. But maybe that's only for optimization/legacy reasons.
I just use the SNES version after remapping everything to a Bisqwit or Drag palette anyway.
Thank you very much.
Can I drag any picture onto this program?
I mean, I want to drag a photo taken with a camera onto it.
Could you please release a Mac version?
I really need that.
Thank you very much.