The NES is brought up numerous times throughout the video (especially at the very end of part 5, and the very beginning of part 6).
One thing I really want to stress here is that the video (part 3 or 4, I forget) actually brings up the Zophar's Domain website and goes through all of the NES emulators listed, noting that it's quite likely every one of those authors had to write their own 6502 core -- and chances are every one of them is buggy in some way (given that there are always edge cases emulator authors don't know of or don't implement). I think this forum pretty much proves that true.
I indirectly tried to make this point by creating
the NES emulator list on the nesdev Wiki (particularly the "Under Development" ones), and I've stated a couple of times here on the forum that people need to stop writing their own 6502 cores and instead use an established core that is known to work reliably and accurately. Sorry to inject a rant/jab at folks, but it's really sad seeing everyone and their dog making an emulator when it makes more sense to improve or refine to perfection something that already exists[1].
With the introduction of visual6502.org, I'm hoping that will come to fruition.
The presentation mentioned above also talks about the process the visual6502.org guys used to recreate the processor at the transistor, bus, and trace level -- and how the exact same technique could be used for other chips. It would be really wonderful to have someone dedicated to working with these guys to accomplish the same thing for the NES PPU.
<somewhat-off-topic>
[1]: I acknowledge that emulating a processor is an educational experience for someone wanting to learn how a CPU works, but it makes a lot more sense to start programming *on/for* the CPU. I can't begin to count how many people show up here with absolutely no knowledge of microprocessor architecture or what an opcode is, yet they want to make an emulator. I don't know why this irks me so much (probably some OCD thing), but it does. It's like people are going about learning how something works in a completely backwards manner. Summarised version of this paragraph: my opinion is that if you want to learn how the NES "works", why not start by programming a game or something on it first? For example, if I wanted to learn how the Amiga worked, I sure as hell wouldn't start by writing an Amiga emulator -- I'd start by getting my hands on an original Motorola 68000 assembly reference manual and learning 68K. I'd familiarise myself with the system and processor first, not jump right in and try to emulate the thing. I don't like 68K anyway. ;-)
Footnote/story: How did *I* learn 65xxx assembly? I started with the original Apple II+ and then moved to the Apple IIe, doing strictly Applesoft BASIC while in school (we had a IIGS, but only one, and it was mainly reserved for the faculty). As we got more and more into using graphics and lots of mathematical operations, I found that my programs were gradually getting slower and slower. After class one day I asked my teacher how I could make them run faster, and he literally tossed me a book on 6502/65c02 machine language and introduced me to the System Monitor on the Apple II (CALL -151). I started doing machine language first, and after I had reached my wits' end with how tedious it was, I asked the same teacher if there was a better way to write such code. He pointed me to assembly language instead of machine language, and I was permitted to use his copy of Merlin 8 while at school. Within a year or so I was doing pretty much all of my coding in 65c02, and then 65816 (once I got my own personal IIGS at home). The rest is history. Today I mainly do C and perl, but literally everything I design/code has a "bare-bones" or KISS mentality applied to it, and that all stems from working with the 65xxx series on 1MHz processors.
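For anyone who hasn't touched this stuff, the machine-language-vs-assembly distinction above is easy to make concrete: an assembler translates mnemonics into the raw opcode bytes you'd otherwise hand-enter in the monitor. Here's a minimal sketch in Python -- the opcode encodings are real 6502 values, but the toy `assemble` helper is purely illustrative, not any actual tool:

```python
# Toy illustration of what an assembler does: mnemonics in, opcode bytes out.
# The three encodings below are the genuine 6502 opcodes for these forms.
OPCODES = {
    ("LDA", "imm"): 0xA9,  # load accumulator, immediate operand
    ("STA", "abs"): 0x8D,  # store accumulator, absolute address
    ("RTS", None):  0x60,  # return from subroutine
}

def assemble(program):
    """Translate (mnemonic, addressing mode, operand) tuples into bytes."""
    out = []
    for mnem, mode, operand in program:
        out.append(OPCODES[(mnem, mode)])
        if mode == "imm":
            out.append(operand & 0xFF)
        elif mode == "abs":
            out.append(operand & 0xFF)         # low byte first (6502 is little-endian)
            out.append((operand >> 8) & 0xFF)  # then high byte
    return bytes(out)

# LDA #$05 / STA $0400 / RTS  assembles to  A9 05 8D 00 04 60
code = assemble([("LDA", "imm", 0x05), ("STA", "abs", 0x0400), ("RTS", None, None)])
print(code.hex().upper())  # A9058D000460
```

Typing `A9 05 8D 00 04 60` into the monitor byte by byte, for every routine, is exactly the tedium that pushes you toward a real assembler like Merlin 8.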
Have I ever written an emulator? Sure, I've helped on a couple (qNES and VeNES). But I can't imagine starting to learn how a computer (or processor) works by writing an emulator without any prior knowledge of how a microprocessor works. That's the point I'm trying to get across.
</somewhat-off-topic>