I have started working on my own NES emulator, and after finishing most, if not all, of my CPU core I began testing it with the nestest ROM, but I ran into a strange error at the following instructions.
Upon further inspection I see that my emulator has the correct values: A = 0, $01 = 0xFF, and P = 0x2E. According to the nestest log, P = 0xEE after executing BIT, but how is that possible when A = 0 and $01 = 0xFF? 0x00 & 0xFF = 0x00, which means both the N and V flags should be clear at that point. Following that logic, I would expect P = 0x2E after executing BIT.
Am I missing something, or is there some error with the nestest log?
Code:
C8B9 18 CLC A:00 X:00 Y:00 P:2F SP:FB
C8BA 24 01 BIT $01 = FF A:00 X:00 Y:00 P:2E SP:FB
C8BC A9 55 LDA #$55 A:00 X:00 Y:00 P:EE SP:FB
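To make my reasoning concrete, here is a simplified Python sketch of how my BIT implementation updates the flags (the N and V flags are taken from the result of A & M, which is what my reasoning above assumes; the flag masks are the standard 6502 bit positions):

```python
# 6502 status flag bit positions
FLAG_Z = 0x02  # zero
FLAG_V = 0x40  # overflow
FLAG_N = 0x80  # negative

def bit_test(a, m, p):
    """Apply BIT the way my emulator currently does:
    Z, N, and V all derived from (A & M)."""
    result = a & m
    p &= ~(FLAG_Z | FLAG_V | FLAG_N) & 0xFF  # clear affected flags
    if result == 0:
        p |= FLAG_Z
    if result & 0x80:
        p |= FLAG_N
    if result & 0x40:
        p |= FLAG_V
    return p

# A = 0x00, $01 = 0xFF, P = 0x2E (state after CLC in the log above)
print(hex(bit_test(0x00, 0xFF, 0x2E)))  # prints 0x2e, not the 0xEE in the log
```

So by this logic P stays 0x2E, which is why the 0xEE in the log confuses me.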