Before I start writing an app from scratch, are there any out there that will convert bitmap lines for unique 16-color updates? I believe the Atari ST and Apple IIGS did stuff like this to display static images with more colors. Are there any PC apps that will do the trick (quantize each line to a unique set of 16 colors)?
It's something you could do really simply from the command line with the netpbm tools ... something like
Code:
for y in `seq 0 (image height - 1)`; do pngtopnm (file) | pamcut -top $y -height 1 | ppmquant 16 > $y.ppm; done
Which target platform is the image for?
There might already be tools specific to the Game Boy Color, and that had a 32-color limit per scanline.
The target system is 9-bit RGB, PCE or MD. I wanted to try something simple, like a full 16-color change per scanline, to see how good the results would be.
Never heard of netpbm. I'll have to look into that.
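To make the 9-bit RGB target concrete: each channel gets 3 bits (8 levels), which is exactly what `pnmdepth 7` does to a maxval-255 PPM. A minimal Python sketch of that snap (my own illustration, not netpbm code):

```python
def to_rgb9(r, g, b):
    """Round each 8-bit channel to the nearest of 8 levels (3 bits),
    then scale back to the displayable 0-255 range."""
    def snap(c):
        level = round(c * 7 / 255)      # 0..7, the 3-bit channel value
        return round(level * 255 / 7)   # back to 0..255 for display
    return snap(r), snap(g), snap(b)

print(to_rgb9(200, 100, 30))  # (182, 109, 36)
```

The full 9-bit gamut is only 8 × 8 × 8 = 512 colors, which is why quantizing early throws away so much information.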
Just for reference, I did this with the Wikimedia test image RGB_24bits_palette_sample_image.jpg:
Code:
# reduce to 3 bits per channel (maxval 7), i.e. the 9-bit RGB gamut
djpeg RGB_24bits_palette_sample_image.jpg | pnmdepth 7 > RGB_9bits_palette_sample_image.ppm
# quantize each of the 200 scanlines to its own 16-color palette
for y in `seq 0 199`; do pamcut -top $y -height 1 RGB_9bits_palette_sample_image.ppm | ppmquant 16 > $y.ppm; done
# the three globs keep 0-9, 10-99, 100-199 in numeric order
pnmcat -tb ?.ppm ??.ppm 1??.ppm | pnmtopng > RGB_4_of_9bits_palette_sample_image.png
and got this image:
Attachment:
RGB_4_of_9bits_palette_sample_image.png [ 11.39 KiB ]
You should get better results by using dithering and by moving the bit-depth reduction later in the pipeline, but this was the simplest thing I could think of that would work.
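For reference, here is what the per-line quantization step does in isolation. This is my own popularity-based sketch, much cruder than the median-cut algorithm `ppmquant` actually uses, but it shows the shape of the technique: each scanline gets its own palette of at most 16 colors drawn from the 9-bit grid.

```python
from collections import Counter

def quantize_line(pixels, ncolors=16):
    """Quantize one scanline (list of (r, g, b) tuples) to its own
    <= ncolors palette drawn from the 9-bit gamut.  Popularity-based
    for brevity; ppmquant's median cut usually does better."""
    def snap(p):  # nearest 9-bit (3-3-3) color, scaled back to 0..255
        return tuple(round(round(c * 7 / 255) * 255 / 7) for c in p)
    # palette = the ncolors most frequent snapped colors on this line
    palette = [c for c, _ in Counter(snap(p) for p in pixels).most_common(ncolors)]
    def nearest(p):  # nearest palette entry by squared RGB distance
        return min(palette, key=lambda q: sum((a - b) ** 2 for a, b in zip(p, q)))
    return [nearest(p) for p in pixels]
```

Because every line is quantized independently, two adjacent lines can pick quite different palettes for similar content, which is exactly where the horizontal banding comes from.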
The horizontal line artifacts are what I was afraid of. I don't think this approach alone is going to be acceptable as an automated process. The color count is high, 144 colors, but it doesn't really give the perception of color fidelity in detail. I was expecting a little more from that technique.
I have sixteen 16-color subpalettes to work with. I guess I really don't have any choice but to use 15 of them to represent some of the simpler scene areas, losing some detail in the process, and save the spare one for mid-frame line updates in the more complex and dense areas.
As I said, there are two conspicuous shortcuts I took:
1- I reduced the bit depth to 9bpp at the very beginning instead of keeping 24bpp, meaning that the error signal is awful
2- Although it selected a new 16-color palette for each scanline, it converted each pixel in isolation.
Diffusing the error up and/or down should help with the banding.
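One way to picture that vertical diffusion: carry each line's quantization residual straight down into the next line before that line is quantized. This is my own minimal sketch, not netpbm code, and it only diffuses downward (real Floyd-Steinberg spreads error to several neighbours); `quantize_line` is whatever per-line quantizer you supply.

```python
def quantize_image(rows, quantize_line):
    """Quantize an image row by row, diffusing each row's quantization
    error straight down into the next row (crude 1-D vertical diffusion)."""
    out = []
    carry = [(0, 0, 0)] * len(rows[0])  # error carried down from the previous row
    for row in rows:
        # add the carried error to this row, clamping channels to 0..255
        adjusted = [tuple(max(0, min(255, c + e)) for c, e in zip(p, err))
                    for p, err in zip(row, carry)]
        q = quantize_line(adjusted)
        # residual = what the quantizer failed to represent on this row
        carry = [tuple(a - b for a, b in zip(p, qp)) for p, qp in zip(adjusted, q)]
        out.append(q)
    return out
```

With a harsh quantizer the effect is easy to see: a flat mid-gray region comes out as alternating light and dark lines whose average approximates the original value, trading banding for dithering noise.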