Just curious...if a DMC sample is playing, and the bank it sits in is switched for a few hundred cycles and switched back...what would likely happen? Noise for 1/100th of a second? Would the sample stop?
I'm trying out an AxROM style mapper, and I think it would be a waste of space if I had to put the DMC samples on every bank.
The DMC fetches a new sample byte every 400 (PAL, at its fastest rate) to 3424 (NTSC, at its slowest rate) CPU cycles.
If the wrong bank is mapped when it fetches its byte, it'll just fetch the wrong byte.
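If you want to put numbers on it: one sample byte covers 8 output bits, so the byte-fetch interval is just the rate-table bit period times 8. Quick host-side sketch in C (the table values are the usual ones, but worth double-checking against the nesdev wiki):

```c
#include <stdio.h>

/* DMC bit-output periods in CPU cycles, index 0 (slowest) to 15 (fastest).
   Values quoted from memory; verify against the nesdev wiki. */
static const int ntsc_period[16] = {
    428, 380, 340, 320, 286, 254, 226, 214,
    190, 160, 142, 128, 106,  84,  72,  54
};
static const int pal_period[16] = {
    398, 354, 316, 298, 276, 236, 210, 198,
    176, 148, 132, 118,  98,  78,  66,  50
};

int main(void)
{
    /* A sample byte holds 8 bits, so a new byte is fetched every 8 bit periods. */
    printf("NTSC byte fetch interval: %d (fastest) to %d (slowest) CPU cycles\n",
           8 * ntsc_period[15], 8 * ntsc_period[0]);   /* 432 .. 3424 */
    printf("PAL  byte fetch interval: %d (fastest) to %d (slowest) CPU cycles\n",
           8 * pal_period[15], 8 * pal_period[0]);     /* 400 .. 3184 */
    return 0;
}
```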
Usually the usage pattern means the wrong bytes get fetched at a regular interval (e.g. 60 Hz), which will likely manifest as a bit of a buzz at that frequency. The strength and timbre of the buzz would vary with how much of each interval is spent playing the wrong bytes.
Note also that the DPCM is only going to fetch a new byte every 400 cycles or so at worst, so if the time spent in the wrong bank is relatively small, you might not even catch a bad byte every frame.
Also, receiving "noise" DPCM is not nearly as strong an effect as it would be if it were PCM. The delta counter only moves by +2 or -2 per sample bit, so the amount of distortion you get from a single bad byte is fairly low.
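To see how small that is, here's a rough model of the delta step (host-side C sketch, not NES code): even a byte of all 1s can only drag the 0-127 output level 16 steps before the next good byte takes over from wherever it landed.

```c
#include <stdio.h>

/* Rough model of the DMC delta decoder: the output level is a 7-bit
   counter that moves +2 (bit = 1) or -2 (bit = 0) per sample bit,
   clamping at 0 and 127 instead of wrapping. */
static int decode_byte(int level, unsigned char sample)
{
    for (int bit = 0; bit < 8; bit++) {
        if (sample & (1 << bit)) {          /* bits are consumed LSB first */
            if (level <= 125) level += 2;
        } else {
            if (level >= 2)   level -= 2;
        }
    }
    return level;
}

int main(void)
{
    int level = 64;                          /* start at mid-scale */
    level = decode_byte(level, 0xFF);        /* worst-case wrong byte: all 1s */
    printf("after one all-ones byte: %d\n", level);   /* 64 + 8*2 = 80 */
    return 0;
}
```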
I figured as much, but I hadn't thought about the 60 Hz buzz. Thanks.
You might even find a use for the random noise that arises, or for playing back code as DPCM. Unrolled PPU update code can usually produce some kind of limited wavetable-like tone.
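As a very rough guide to the pitch you'd get: if the code being played repeats every N bytes, the 1-bit pattern repeats every N*8 output bits, which is what sets the tone. Back-of-envelope sketch (the 3-byte loop and the rate choice are made-up examples):

```c
#include <stdio.h>

/* Pitch estimate for playing a repeating stretch of code as DPCM.
   The loop length and rate below are purely illustrative. */
int main(void)
{
    const double cpu_hz   = 1789773.0;   /* NTSC CPU clock */
    const int    period   = 54;          /* fastest NTSC DMC bit period ($F) */
    const int    loop_len = 3;           /* e.g. an unrolled "STA $2007" pattern */

    double bit_rate = cpu_hz / period;               /* output bits per second */
    double tone_hz  = bit_rate / (loop_len * 8.0);   /* one waveform cycle per repeat */

    printf("~%.0f Hz tone from a %d-byte repeating pattern\n", tone_hz, loop_len);
    return 0;
}
```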