Difference between 6502 and 2A03 CPU core

Difference between 6502 and 2A03 CPU core
by on (#107780)
Although Quietust has already done some exploration, I decided to do my own.

After a detailed study of the 2A03 circuit, the following results were obtained:
- No differences were found in the instruction decoder
- The D flag works as expected: it can be set or cleared by the SED/CLD instructions, and it is handled in the normal way during interrupt processing (saved on the stack) and by the PHP/PLP and RTI instructions.
- The random logic responsible for generating the two control lines DAA (decimal addition adjust) and DSA (decimal subtraction adjust) works normally.

The difference lies in the fact that the control lines DAA and DSA, which enable decimal correction, are disconnected from the rest of the circuit by cutting five pieces of polysilicon (see picture; polysilicon is marked purple, the missing pieces cyan).

As a result, the decimal-carry circuit and the decimal-correction adders do not work.
Therefore, the embedded processor of the 2A03 always treats add/subtract operands as binary numbers, even if the D flag is set.
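The effect can be illustrated with a minimal Python model of ADC. This is a behavioral sketch, not a gate-level description; flag handling other than carry is omitted, and invalid BCD inputs are not considered:

```python
def adc(a, b, carry_in, decimal_mode):
    """8-bit ADC model. On a stock 6502, decimal_mode (the D flag)
    enables BCD correction; on the 2A03 the DAA/DSA lines are cut,
    so the correction below never fires and addition stays binary."""
    result = a + b + carry_in
    if decimal_mode:
        # Decimal addition adjust: fix any nibble that exceeds 9.
        if (a & 0x0F) + (b & 0x0F) + carry_in > 9:
            result += 0x06
        if result > 0x99:
            result += 0x60
    carry_out = int(result > (0x99 if decimal_mode else 0xFF))
    return result & 0xFF, carry_out

# Stock 6502, D flag set: 0x19 + 0x28 gives BCD 0x47 (19 + 28 = 47).
# 2A03 (correction cut):  0x19 + 0x28 gives plain binary 0x41.
```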

[Image: 2A03 die shot showing the cut polysilicon pieces]

PSD source : http://breaknes.com/files/APU/core.zip [155 MB]
Podcast (Russian) : http://youtu.be/Gmi1DgysGR0
6502 schematics : http://breaknes.com/files/6502/6502.jpg
Re: Difference between 6502 and 2A03 CPU core
by on (#107783)
Are you saying that the circuit is there, it's just not connected? That's pretty interesting... They simply had to make the CPU different (for legal reasons?) so they just "broke" a certain feature, instead of not implementing it at all... so weird!
Re: Difference between 6502 and 2A03 CPU core
by on (#107785)
Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is patent : http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut the decimal correction.
Re: Difference between 6502 and 2A03 CPU core
by on (#107787)
And remember that at the time, only patents covered microprocessors. Copyright-like exclusive rights in integrated circuit topographies don't apply to ICs first sold before about 1990. Perhaps this is why the Super NES's CPU includes an authentic second-source 65816 core.
Re: Difference between 6502 and 2A03 CPU core
by on (#107905)
This actually makes full sense. Placing transistors on a die is a difficult, complex, and painful job. Today, for digital circuits, this can be automated, but back in the '80s I'm not sure it could be. It makes sense that they would use a working die and just remove a few connections instead of having to re-do a 6502 without the decimal mode.
Re: Difference between 6502 and 2A03 CPU core
by on (#107909)
Sneaky. So Commodore's engineers were correct:

Quote:
[Commodore 64 programmer] Robert Russell investigated the NES, along with one of the original 6502 engineers, Will Mathis. “I remember we had the chip designer of the 6502,” recalls Russell. “He scraped the [NES] chip down to the die and took pictures.”

The excavation amazed Russell. “The Nintendo core processor was a 6502 designed with the patented technology scraped off,” says Russell. “We actually skimmed off the top of the chip inside of it to see what it was, and it was exactly a 6502. We looked at where we had the patents and they had gone in and deleted the circuitry where our patents were.”


Quoted from Bagnall's On the Edge book.
Re: Difference between 6502 and 2A03 CPU core
by on (#107922)
So how useful would decimal mode have really been?
Re: Difference between 6502 and 2A03 CPU core
by on (#107923)
Dwedit wrote:
So how useful would decimal mode have really been?

It could have made scores and other stats easier to manage... Can't think of anything else.

Since I learned assembly with the 2A03, I don't really miss the decimal mode. In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about it again.
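A binary-to-decimal routine of that kind really is short enough to write once and forget. Here's a sketch of the idea in Python; on the NES it would be a small 6502 loop doing repeated division or subtraction by powers of ten:

```python
def bin_to_digits(value, width=6):
    """Split a binary score into individual decimal digits,
    most significant first, ready to be drawn as tiles."""
    digits = []
    for _ in range(width):
        digits.append(value % 10)   # extract the lowest digit
        value //= 10
    return digits[::-1]             # reorder to most-significant-first

# bin_to_digits(48213) -> [0, 4, 8, 2, 1, 3]
```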
Re: Difference between 6502 and 2A03 CPU core
by on (#107927)
Bregalad wrote:
This actually makes full sense. Placing transistors on a die is a difficult, complex, and painful job. Today, for digital circuits, this can be automated, but back in the '80s I'm not sure it could be. It makes sense that they would use a working die and just remove a few connections instead of having to re-do a 6502 without the decimal mode.

These days a lot is automated but the result is far from optimal and still needs human intervention to fix the worst offenders - it just avoids most of the work. It still takes a lot of effort to get done, especially with the complexity of current chips.
Re: Difference between 6502 and 2A03 CPU core
by on (#107942)
tokumaru wrote:
In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about this again.

But they still have to be fast enough. ARMv4 (e.g. ARM7TDMI) doesn't have decimal mode or hardware divide. Someone on the gbadev board used to complain that the sprintf() call to convert binary numbers to decimal to draw the status bar every frame ate up a substantial portion of the available CPU time. And if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version? That's what a lot of Atari 2600 game programmers tended to do, I'm told.
Re: Difference between 6502 and 2A03 CPU core
by on (#107944)
IMO, that's just bad programming then. Keep an x-digit piece of RAM, null-terminated, and have all points stored in an array where every digit is a byte. It's not that hard to fix, even in C.
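The digit-per-byte approach can be sketched in a few lines. The names here are illustrative, and a real NES version would be a short 6502 loop with manual carry propagation:

```python
def add_points(score, amount):
    """Add `amount` to a score stored one decimal digit per byte,
    most significant digit first, carrying between digits by hand --
    the trick many 8-bit games used instead of decimal mode."""
    carry = amount
    for i in range(len(score) - 1, -1, -1):
        total = score[i] + carry
        score[i] = total % 10
        carry = total // 10   # overflow past the top digit is dropped
    return score

# add_points([0, 1, 2, 3], 45) -> [0, 1, 6, 8]   (0123 + 45 = 0168)
```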
Re: Difference between 6502 and 2A03 CPU core
by on (#107951)
3gengames wrote:
tepples wrote:
if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version?

Keep an x-digit piece of RAM, null-terminated, and have all points stored in an array where every digit is a byte.

Fans of decimal mode might have called that a waste of memory.
Re: Difference between 6502 and 2A03 CPU core
by on (#107953)
I imagine the most effective use case for this is an accounting program where you are keeping track of a lot of numbers onscreen, and you want to keep the UI responsive.

Of course, there's the additional overhead when multiplying BCD, which might throw a wrench into that goal...


Anyhow, it's convenient to have. It's better than having to write extra software routines to do the same thing, but as has been pointed out, those aren't that hard to drop into your program anyway, so the benefit is pretty minimal. If the NES had it, it would have been used.
Re: Difference between 6502 and 2A03 CPU core
by on (#183343)
org wrote:
Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is patent : http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut the decimal correction.


Even after excising the decimal mode circuitry, what about the rest of it? Why didn't they have to pay royalties for using the "integrated circuit microprocessor" modules?
Re: Difference between 6502 and 2A03 CPU core
by on (#183344)
In 1983 there was no mask-work copyright.

The Famicom was made in the early 1980s, when copyright-like exclusive rights in mask works didn't exist yet. Until then, integrated circuit layouts were seen as too "utilitarian" to qualify for ordinary copyright. But by the release of the Super Famicom, the Treaty on Intellectual Property in Respect of Integrated Circuits (IPIC) of 1989 had been signed. So Nintendo licensed the 65816 from WDC.
Re: Difference between 6502 and 2A03 CPU core
by on (#183349)
tepples wrote:
In 1983 there was no mask-work copyright.

The Famicom was made in the early 1980s, when copyright-like exclusive rights in mask works didn't exist yet. Until then, integrated circuit layouts were seen as too "utilitarian" to qualify for ordinary copyright. But by the release of the Super Famicom, the Treaty on Intellectual Property in Respect of Integrated Circuits (IPIC) of 1989 had been signed. So Nintendo licensed the 65816 from WDC.


But it was patented. So does that mean that the patent does not describe the 6502 instruction set behaviors? Or were the modules covered by separate patents and they only wanted to license some of them?
Re: Difference between 6502 and 2A03 CPU core
by on (#183351)
Patents are available only for novel and useful inventions. As far as I'm aware, two things were novel about the 6502. One was MOS's method to patch defective masks, which increased yield enough to make the 6502 profitable even at a list price of $25. The mask patching presumably didn't apply to Ricoh's fab. The other was decimal mode.
Re: Difference between 6502 and 2A03 CPU core
by on (#183352)
tepples wrote:
Patents are available only for novel and useful inventions. As far as I'm aware, two things were novel about the 6502. One was MOS's method to patch defective masks, which increased yield enough to make the 6502 profitable even at a list price of $25. The mask patching presumably didn't apply to Ricoh's fab. The other was decimal mode.


I thought that Ricoh licensed the 6502 from Coleco. Did Nintendo/Ricoh completely avoid paying royalties by doing this?
Re: Difference between 6502 and 2A03 CPU core
by on (#183354)
MOS was part of Commodore, not Coleco.

For what the patented parts were, see the On the Edge excerpt on the previous page.
Re: Difference between 6502 and 2A03 CPU core
by on (#183355)
tepples wrote:
I remember reading (I don't know where) about {someone} REing the 2A03 and being surprised to find that it was a 6502 with the patented parts filed off.

Good memory! It was a page ago. ;-D