Hi, I'm making an HD pack for a game with HDNes. The game frequently fades the screen to black, so I'm adding the ability to specify an HD replacement tile with a "brightness %" value; when the emulator renders that tile, it will multiply the RGB values by that brightness to get the final color. This way, I don't need to make copies of the same image at different brightness levels.
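The per-tile multiply could look something like this. This is just a sketch, not HDNes's actual code; the `RGB` struct and `applyBrightness` name are my own invention, and the real renderer's pixel format will differ.

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical pixel type; HDNes's real pixel/texture structures will differ.
struct RGB {
    uint8_t r, g, b;
};

// Scale a replacement tile's pixel by a brightness factor in [0, 1].
// Rounds to nearest and clamps, so brightness 1.0 leaves the color untouched.
RGB applyBrightness(RGB in, float brightness) {
    auto scale = [&](uint8_t c) {
        int v = static_cast<int>(c * brightness + 0.5f);
        return static_cast<uint8_t>(std::clamp(v, 0, 255));
    };
    return { scale(in.r), scale(in.g), scale(in.b) };
}
```

With a 50% brightness value, `{200, 100, 50}` comes out as `{100, 50, 25}`, so one bright source image covers every fade step.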
Now, if the emulator can detect the darkening of a palette, and an HD replacement tile is already provided for that tile with the "brightest" palette, then the emulator can automatically assign a brightness value and render the tile with it. This way, little work is needed to handle fading the screen in or out.
So if I have a lookup table mapping each color to a list of its darkened variations with their relative brightness values, I can check whether a "brightest" palette is specified for that tile and whether all three colors are darkened variations of the respective colors in that "brightest" palette. If so, I can use the average of the relative brightness values as the brightness value of the darkened palette. Since fades in and out are usually quick, I don't need to be very accurate. Colors that share the same second hex digit (the hue column of the NES palette value) are obvious matches, but others are more subjective.
Do you think my idea will work? Any suggestions on how to build the lookup table, or comments on the whole idea? Is it possible to extend this to other types of palette change?
Thanks.