rainwarrior wrote:
For simple upscaling, though, a lot of programs nowadays would offload this work to a graphics card instead of doing it in software. Maybe you should investigate that route instead?
The problem with this method, at least with DirectX and friends, is that the memory backing the surface (I think in DX terms it's the memory associated with a surface?) usually lives on the graphics card natively, and the hardware stretch blit then applies interpolation or anti-aliasing or whatever -- the visual results look blurry as fuck. Some people think "it looks smoother", but they're crazy blind fools: it looks like shit, period.
I cannot STAND emulators that do this. But I'm also a serious hater of filters like 2xSaI and all that other garbage; I want the pixellated look to stay pixellated (plain nearest-neighbour scaling) when I scale 2x, 4x, 8x, etc...
As I understand it, there are ways around this with DirectX (DirectDraw, specifically), where you tell it to allocate the surface in system memory instead; the blit then runs on the CPU, and the scaled image stays sharp rather than blurry. I imagine there are also ways to do this in Direct3D while still using GPU-native memory. In fact, I know there are...
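For the curious, here's roughly what that "use system memory" option boils down to in DirectDraw terms. Just a sketch under my own assumptions: the helper name is mine, and it presumes an IDirectDraw7 interface has already been initialized.

[code]
#include <ddraw.h>

// Sketch: create the emulator's off-screen surface in system memory
// instead of video memory. Blits from a system-memory surface are done
// by the CPU, which copies pixels 1:1 with no filtering.
IDirectDrawSurface7* CreateSystemMemorySurface(IDirectDraw7* ddraw,
                                               DWORD width, DWORD height)
{
    DDSURFACEDESC2 desc = {};
    desc.dwSize = sizeof(desc);
    desc.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    desc.dwWidth = width;
    desc.dwHeight = height;
    desc.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;

    IDirectDrawSurface7* surface = nullptr;
    if (FAILED(ddraw->CreateSurface(&desc, &surface, nullptr)))
        return nullptr;
    return surface;
}
[/code]

The DDSCAPS_SYSTEMMEMORY cap is the whole trick; without it the driver is free to put the surface in video RAM and run the stretch through the GPU's (filtered) blitter.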
One such example is older versions of Nestopia: to get that crisp pixellated look, you had to go into Options / Video and change Memory Pool from Video to System. However, the version of Nestopia I use today (the unofficial fork some dudes maintain somewhere, the one EmuCR posts SVN/Git builds of whenever something changes) lets me leave Options / Video / Memory Pool set to Video and still get scaling that looks crisp/sharp. I just go to View / Screen Size / 2x and it looks good. It didn't used to work this way, so somewhere along the line someone improved something.
Attached is an example; it speaks for itself. (Sorry, I had to use VirtuaNES to get the blurry look; normally I check Option / Graphics / SystemMemory Surface to ensure this doesn't happen.)
The downside, as I understand it, is that a system-memory surface is slower than native video memory, since every frame has to be copied across the bus. How much slower varies with the hardware and operating system: a PCI video card on a circa-2004 motherboard is going to feel it a lot more than, say, a PCIe video card on a circa-2009 motherboard. But as Nestopia shows, there is a way to get crisp scaling while still using native video memory. I just don't speak enough DirectX or Direct3D to know how to do it.
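For whoever does speak it: in Direct3D 9 the blur comes from the default bilinear texture filter, not from where the memory lives, so the texture can stay in video memory and you just switch the sampler to point filtering. A minimal sketch (the function name and surrounding setup are my assumptions; it presumes the frame is drawn as a textured quad on an IDirect3DDevice9):

[code]
#include <d3d9.h>

// Sketch: tell the GPU to pick the single nearest texel instead of
// blending four of them, so the upscale stays blocky/crisp even though
// the texture lives in video memory.
void UsePointSampling(IDirect3DDevice9* device)
{
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT); // upscaling
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT); // downscaling
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);  // no mipmaps
}
[/code]

That's presumably the kind of thing the newer Nestopia builds do: keep the fast video-memory path but kill the filtering at the sampler.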
And no, I don't test full-screen mode in anything.
And please don't forget about us XP users, by the way. Don't go the "screw it, I'll use DirectX 11 exclusively" route, unless you also plan on implementing a version using DX9, or even GDI.
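For the record, even plain GDI can do the crisp look on XP. A minimal sketch, assuming the emulator's 256x240 frame is in a bitmap already selected into a memory DC (the helper name and dimensions are my assumptions):

[code]
#include <windows.h>

// Sketch: stretch the 256x240 frame onto the window at an integer scale.
// COLORONCOLOR is GDI's nearest-neighbour stretch mode, so no blurring,
// and it works fine all the way back to XP.
void BlitScaled(HDC windowDC, HDC memDC, int scale)
{
    SetStretchBltMode(windowDC, COLORONCOLOR);
    StretchBlt(windowDC, 0, 0, 256 * scale, 240 * scale,
               memDC, 0, 0, 256, 240, SRCCOPY);
}
[/code]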