I've implemented this in Nintendulator, though I have no idea if it is actually accurate:
1. For each palette entry, construct the chroma waveform (as an array of 12 floats or doubles).
2. Given the desired emphasis bits, attenuate the parts of the chroma signal whose phases they cover. (Steps 1 and 2 are sketched in the first code block after this list.)
3. Find the phase offset of your chroma signal. In my emulator, I used a function that finds it to within roughly 1/5th of a degree in 3 passes (first to within 30 degrees, then 2.5 degrees, and finally 0.208333 degrees) by comparing the chroma signal against a sine wave (with the same amplitude and offset) at 12 candidate phases per pass and narrowing in on the one that gave the least absolute error.
4. Find the amplitude of your chroma signal. In my emulator, I simply subtracted the signal's DC offset (the mean of all 12 values, which also serves as the luminance) and took the quadratic mean (otherwise known as RMS) of what remained.
5. Feed your hue, saturation, and luminance into a YIQ->RGB converter [Y = luminance, I = saturation * sin(hue), Q = saturation * cos(hue)] and collect the results. (Steps 3 through 5 are sketched in the second code block below.)
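
Here's a minimal sketch of steps 1 and 2 in C. The voltage level table, the ~0.746 attenuation factor, the function names, and the handling of the constant-level colors ($x0 and $xD-$xF) are all assumptions for illustration, not anything I've measured myself:

```c
#define SAMPLES 12  /* one pixel spans 12 phases of the colorburst clock */

/* Assumed PPU output levels: index 0-3 = square wave low for luma 0-3,
   index 4-7 = square wave high for luma 0-3. Substitute your own values. */
static const double levels[8] = {
    0.350, 0.518, 0.962, 1.550,
    1.094, 1.506, 1.962, 1.962
};
static const double attenuation = 0.746;  /* assumed emphasis factor */

/* The square wave for hue c is high during 6 of the 12 phases. */
static int in_color_phase(int c, int phase)
{
    return (c + phase) % 12 < 6;
}

/* Steps 1-2: build the 12-sample chroma waveform for one palette entry
   ("llcccc") and attenuate the phases covered by active emphasis bits. */
void build_waveform(int entry, int emphasis, double out[SAMPLES])
{
    int color = entry & 0x0F;
    int level = (entry >> 4) & 3;
    if (color > 13) level = 1;        /* assume $xE/$xF force level 1 */

    double lo = levels[level], hi = levels[4 + level];
    if (color == 0)  lo = hi;         /* $x0: constant high (grays)  */
    if (color > 12)  hi = lo;         /* $xD-$xF: constant low (blacks) */

    for (int p = 0; p < SAMPLES; p++) {
        double s = in_color_phase(color, p) ? hi : lo;
        if (((emphasis & 1) && in_color_phase(0, p)) ||
            ((emphasis & 2) && in_color_phase(4, p)) ||
            ((emphasis & 4) && in_color_phase(8, p)))
            s *= attenuation;
        out[p] = s;
    }
}
```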
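
And a sketch of steps 3 through 5 under the same assumptions: luminance and saturation come straight from the description above, the coarse-to-fine search mirrors the 3-pass scheme in step 3, and the YIQ->RGB matrix is the standard FCC one (normalizing the output into 0..1 is left out):

```c
#include <math.h>

#define SAMPLES 12
#define TAU 6.28318530717958647692

/* Total absolute error between the DC-free chroma samples and a
   reference sine of the given peak amplitude and phase. */
static double sine_error(const double ac[SAMPLES], double peak, double phase)
{
    double err = 0.0;
    for (int p = 0; p < SAMPLES; p++)
        err += fabs(ac[p] - peak * sin(TAU * p / SAMPLES + phase));
    return err;
}

/* Steps 3-5: luminance = DC offset, saturation = RMS of the AC part,
   hue = best-fit sine phase from 3 coarse-to-fine passes
   (30 -> 2.5 -> ~0.208 degree resolution), then YIQ -> RGB. */
void waveform_to_rgb(const double wave[SAMPLES],
                     double *r, double *g, double *b)
{
    /* DC offset = luminance (first half of step 4) */
    double y = 0.0;
    for (int p = 0; p < SAMPLES; p++) y += wave[p];
    y /= SAMPLES;

    /* RMS of the remainder = saturation (second half of step 4) */
    double ac[SAMPLES], rms = 0.0;
    for (int p = 0; p < SAMPLES; p++) {
        ac[p] = wave[p] - y;
        rms += ac[p] * ac[p];
    }
    rms = sqrt(rms / SAMPLES);
    double peak = rms * sqrt(2.0);    /* peak of the equivalent sine */

    /* 3-pass phase search, 12 candidates per pass (step 3); the first
       pass spans the full circle, later passes bracket the last winner. */
    double step = TAU / 12.0;         /* 30 degrees */
    double hue  = 0.0;
    for (int pass = 0; pass < 3; pass++) {
        double center = hue, best_err = 1e300;
        for (int k = 0; k < 12; k++) {
            double phi = center + (k - 6) * step;
            double e = sine_error(ac, peak, phi);
            if (e < best_err) { best_err = e; hue = phi; }
        }
        step /= 12.0;                 /* 30 -> 2.5 -> 0.2083 degrees */
    }

    /* Step 5: standard FCC YIQ -> RGB matrix */
    double i = rms * sin(hue);
    double q = rms * cos(hue);
    *r = y + 0.956 * i + 0.619 * q;
    *g = y - 0.272 * i - 0.647 * q;
    *b = y - 1.106 * i + 1.703 * q;
}
```

Running every entry/emphasis combination through build_waveform and then waveform_to_rgb would generate a full palette, modulo scaling the results into displayable range.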
The results look fairly close to what my composite monitor and TV tuner display when running my Color Bars v2 test program on my CopyNES.