When the Famicom was designed, were mappers an afterthought, or were they intended to be part of the system from its inception? I have the sense that SMB pushed the Famicom hardware to its limits and that anything more advanced would require extra chips in the cart. Did the engineers recognize this when they built the system?
Masayuki Uemura claims it was not an afterthought: "We designed the hardware from the beginning with the understanding that it would be modular, via the memory mapping chips, so that was intentional from the beginning."
Source:
http://www.usgamer.net/articles/nes-cre ... ole/page-3 (fun article, recommended)
Mappers were common on other platforms before the Famicom was designed, and hardware designers often counted on part prices decreasing over time. One proof that the Famicom was designed with expansions in mind is the presence of the PPU bus on the cartridge connector. This allowed them to put less VRAM inside the console, making it cheaper, and as memory prices decreased, putting video memory inside each cartridge would become less of a problem.
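To make that split concrete, here's roughly how an emulator models the PPU address space; a minimal sketch, assuming 8 KB of CHR on the cart and horizontal mirroring, with names (chr_rom, ciram, ppu_read) that are mine rather than from any particular emulator:

```c
#include <stdint.h>

/* Sketch of the Famicom PPU address space. Pattern table fetches
   ($0000-$1FFF) go out over the cartridge edge, so CHR ROM/RAM lives in
   the cart; only 2 KB of nametable RAM (CIRAM) sits inside the console. */

static uint8_t chr_rom[8 * 1024];   /* on the cartridge (could be RAM, or banked by a mapper) */
static uint8_t ciram[2 * 1024];     /* inside the console */

uint8_t ppu_read(uint16_t addr)
{
    addr &= 0x3FFF;
    if (addr < 0x2000) {
        /* Resolved by whatever memory and mapper logic the cartridge
           wires to the PPU bus. */
        return chr_rom[addr];
    } else if (addr < 0x3F00) {
        /* Internal 2 KB nametable RAM; horizontal mirroring assumed here
           (CIRAM A10 driven by PPU A11), chosen by the cart's wiring. */
        return ciram[((addr >> 1) & 0x400) | (addr & 0x3FF)];
    }
    return 0; /* palette RAM ($3F00-$3FFF) omitted from this sketch */
}
```

Because the pattern table side is entirely the cartridge's problem, a mapper is free to bank it, replace it with RAM, or grow it as memory got cheaper, without touching the console.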
An internet search reveals that 8KB of RAM in 1983 cost around US$15, but by 1986 the same memory had already dropped to US$2.40.
So yeah, you couldn't go all out when designing a video game console because the added cost of all the parts quickly piled up, but the prices of computer parts always dropped really fast, so it simply made sense to plan for hardware expansions, especially in a cartridge-based system, where you can give the cartridge direct access to many parts of the system.
@thefox Thanks for the link. It looks exactly like the type of insider story that I love. I'll read this a bit later when I get a chance.
@tokumaru I agree with everything that you wrote. But, I still find the economics behind their design decisions baffling. An alternative expansion plan could have worked kind of like what Apple does today; they could have released a new version of the Famicom every few months that was completely backwards compatible and included the new chips. And, to avoid pissing off their existing customer base, they could have given away completely for free an in-between device (like the Game Genie) that contained the extra chips.
A glut of such "new versions" contributed to the North American video game recession of 1983-1984.
zeroone wrote:
@thefox Thanks for the link. It looks exactly like the type of insider story that I love. I'll read this a bit later when I get a chance.
@tokumaru I agree with everything that you wrote. But, I still find the economics behind their design decisions baffling. An alternative expansion plan could have worked kind of like what Apple does today; they could have released a new version of the Famicom every few months that was completely backwards compatible and included the new chips. And, to avoid pissing off their existing customer base, they could have given away completely for free an in-between device (like the Game Genie) that contained the extra chips.
I can't remember where I read it (or even find the launch price now), but I recall it being said that they wanted the Famicom to be cheap enough that kids could buy one for themselves. They even removed the controller connectors from the design to make it cheaper; notice how the cables enter the back, but inside they snake all the way around to the front. It sounded like quite a gamble to begin with: to get the price they wanted on the custom chips, they had to agree to order a huge quantity of them. I believe it was also said that they didn't even really want the 6502 initially, but went with it because it was so much cheaper than the Z80.
Not only was the consumer-level 6502 chip cheaper than the Z80 (not entirely relevant here), but the 6502 silicon die was cheaper to incorporate into what Ricoh was making for Nintendo, since the 6502 die itself was about 1/4 the size of the Z80's. Smaller die ~= higher yield = lower price.
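To put a rough number on the yield point, here's the usual back-of-the-envelope Poisson estimate, Y = exp(-D*A); the defect density and die areas below are made-up round numbers chosen only to reflect the ~4x area difference, not actual Ricoh or Zilog figures:

```c
#include <math.h>
#include <stdio.h>

/* Back-of-the-envelope yield comparison using the simple Poisson model
   Y = exp(-D * A). All numbers are illustrative assumptions. */
int main(void)
{
    double defects_per_cm2 = 2.0;  /* assumed defect density            */
    double area_small = 0.17;      /* cm^2, "6502-sized" die (assumed)  */
    double area_large = 0.68;      /* cm^2, roughly 4x larger (assumed) */

    printf("small die yield: %.0f%%\n", exp(-defects_per_cm2 * area_small) * 100.0);
    printf("large die yield: %.0f%%\n", exp(-defects_per_cm2 * area_large) * 100.0);
    return 0;
}
```

With those made-up numbers the small die yields about 71% against 26% for the large one, and on top of that you fit roughly four times as many candidates on each wafer.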
thefox wrote:
Masayuki Uemura claims it was not an afterthought: "We designed the hardware from the beginning with the understanding that it would be modular, via the memory mapping chips, so that was intentional from the beginning."
Source:
http://www.usgamer.net/articles/nes-cre ... ole/page-3 (fun article, recommended)
That reminds me of another recent thread (I can't remember which one) where it was brought up that Nintendo should've gotten rid of the SPC700 but kept the S-DSP. I wonder if Nintendo really wanted the SPC700 or not, and whether they just didn't have enough time to take it up with Sony, or if Sony somehow managed to keep the price down enough that Nintendo didn't mind the extra hardware.
edit: Where is that thread anyway? Was I dreaming it?
I don't think "Nintendo didn't have time" would be correct. The Super Famicom could have come out earlier than it did. I think it got more Work RAM than initially planned thanks to its later release. If they really didn't want the system with the SPC700 in the mix then they could have avoided it. Maybe you are right in the idea that Sony made it cost appealing.
That, and contemporary arcade platforms usually had a separate CPU dedicated to sound. Look at the many 68000-based arcade PCBs that used a Z80, as well as their home counterparts (Genesis and Neo Geo AES). And just as Donkey Kong 3 and Punch-Out!! had used a 6502 clone (with decimal mode disabled) to run their APU, so too did the Super NES use a 65C02 clone (with rearranged opcodes) to run its APU.
I guess it makes sense since that was the popular approach at the time, but I don't know what Sony was thinking having the SPC700 manually read things from the CPU, instead of just halting the SPC700 for a cycle to let the CPU write to its RAM directly. I wonder if Sony refused to change that.
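For anyone who hasn't fought with it: the two processors share only four byte-wide I/O ports (CPU side $2140-$2143, SPC700 side $F4-$F7), so every transfer is a polled mailbox handshake, roughly like the sketch below. This is a simplified, hypothetical version of the usual upload loop rather than the exact boot-ROM protocol, and cpu_write_port/cpu_read_port just stand in for the real register accesses:

```c
#include <stdint.h>

/* Simplified S-CPU side of a transfer into SPC700 audio RAM through the
   four shared I/O ports. The "echo the index back" acknowledgement
   follows the common convention, but the details are illustrative. */

extern void    cpu_write_port(int port, uint8_t v);  /* writes $2140+port */
extern uint8_t cpu_read_port(int port);              /* reads  $2140+port */

void cpu_send_block(const uint8_t *data, int len)
{
    for (int i = 0; i < len; i++) {
        cpu_write_port(1, data[i]);        /* the data byte       */
        cpu_write_port(0, (uint8_t)i);     /* the sequence number */
        /* The 65816 can't poke APU RAM directly; it has to spin until the
           SPC700 program notices the new byte, stores it, and echoes the
           sequence number back through the port. */
        while (cpu_read_port(0) != (uint8_t)i)
            ;
    }
}

/* SPC700 side (conceptually): poll $F4 until the number changes, copy the
   byte from $F5 into audio RAM, then write the number back to $F4. */
```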
Something else I've been wondering about: wouldn't the interpolation be really expensive for the chip? It would have to do 4 multiplies per sample, plus store a 512-entry, 16-bit Gaussian table.
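For reference, the per-sample work looks something like this in emulator form. The indexing into the 512-entry table and the shifts follow commonly published S-DSP descriptions, but treat the exact details (and the intermediate clamping the real chip does) as approximate; gauss[] would hold the actual table values:

```c
#include <stdint.h>

/* 4-tap Gaussian interpolation as emulators typically implement it:
   four table lookups and four multiplies per output sample. */

extern const int16_t gauss[512];   /* the 512-entry, 16-bit table */

int16_t interpolate(const int16_t s[4], uint16_t interp_pos)
{
    int i = (interp_pos >> 4) & 0xFF;   /* fractional position selects the offset */

    int out;
    out  = (gauss[0x0FF - i] * s[0]) >> 11;   /* oldest sample */
    out += (gauss[0x1FF - i] * s[1]) >> 11;
    out += (gauss[0x100 + i] * s[2]) >> 11;
    out += (gauss[0x000 + i] * s[3]) >> 11;   /* newest sample */

    /* Clamp to the signed 16-bit output range. */
    if (out >  32767) out =  32767;
    if (out < -32768) out = -32768;
    return (int16_t)out;
}
```

So it really is four multiply-accumulates every output sample, on top of the BRR decoding, for each of the eight voices.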
Pandocs note an unused audio-in from cart on DMG…did Pokémon Yellow really just utilize channel 3?
Myask wrote:
Pandocs note an unused audio-in from cart on DMG…did Pokémon Yellow really just utilize channel 3?
The GBC programming manual claims that the cartridge audio-in is only "normally usable" on the GBC. It sounds like the mixing level on the DMG is messed up, so it's impossible to produce an audible volume without drawing more power than the cartridge slot is designed for, or something like that.
I seem to remember that Pocket Music for GBC used it, and that's why that particular game isn't compatible with GBA.
thefox wrote:
Masayuki Uemura claims it was not an afterthought: "We designed the hardware from the beginning with the understanding that it would be modular, via the memory mapping chips, so that was intentional from the beginning."
Source:
http://www.usgamer.net/articles/nes-cre ... ole/page-3 (fun article, recommended)
Thanks for the link. It was a good read and it did answer a lot of things I was wondering about.
Memory-mapped cartridge chips were part of the original design, and the console's hardware architects had two primary constraints: keep the manufacturing cost below a strict value and make sure it could run an authentic port of Donkey Kong. The article also mentioned that the lack of a scanline interrupt was not a cost cutting decision; they simply did not think of it.
zeroone wrote:
The article also mentioned that the lack of a scanline interrupt was not a cost cutting decision; they simply did not think of it.
People have found a vestigial cycle counter in early versions of the CPU though, so apparently they did consider offering something more than just the sprite 0 hits for timing.