Hey experts, can I get your take on this idea?
I have used nsf2midi and its English translation before, and would like to replicate its behavior in JavaScript for a project I am working on. If my amateur understanding is correct, this means emulating the NES CPU (and enough of the APU to see what the music code writes to it), running the NSF file through that emulator, and capturing the resulting channel data as appropriate MIDI messages. My guess is that each NES channel would map to its own MIDI channel, so five channels in total: SQ1, SQ2, TRI, NOISE, DPCM (is my understanding correct here? Should there instead be a channel for each voicing of the NES channels? Are the NTSC/PAL/FDS variations a consideration? What about the audio expansion chips, like VRC6, N163, and MMC5?). I recognize that this process will not produce high-quality MIDIs, and would be ecstatic with results similar to nsf2midi's.
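To make that concrete, here's roughly how I picture the capture step working. The register addresses and the pulse-channel frequency formula are straight from the NESdev wiki; the onApuWrite hook, the state object, and the MIDI event shape are just names I made up for illustration, not anything nsf2midi actually does:

```js
const NTSC_CPU_HZ = 1789773; // 2A03 clock, NTSC

function pulseTimerToMidiNote(timer) {
  // NES pulse channel frequency: cpu / (16 * (timer + 1))
  const freq = NTSC_CPU_HZ / (16 * (timer + 1));
  // MIDI note 69 = A4 = 440 Hz
  return Math.round(69 + 12 * Math.log2(freq / 440));
}

// Hypothetical hook: whenever the emulated 6502 writes to an APU register,
// translate it into an event for the MIDI channel assigned to that NES channel.
function onApuWrite(addr, value, state, midiTrack) {
  if (addr === 0x4002) {              // pulse 1 timer low byte
    state.sq1Timer = (state.sq1Timer & 0x700) | value;
  } else if (addr === 0x4003) {       // pulse 1 timer high bits + length counter load (note trigger)
    state.sq1Timer = (state.sq1Timer & 0xff) | ((value & 0x07) << 8);
    midiTrack.push({
      type: 'noteOn',
      channel: 0,
      note: pulseTimerToMidiNote(state.sq1Timer),
      velocity: 100,
    });
  }
  // ...similar handling for SQ2 ($4006/$4007), TRI ($400A/$400B), noise, and DPCM...
}
```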
Because writing an entire emulator is beyond me (for now! If need be, I'm willing to learn), it seems the best approach is to hack an existing emulator to suit my purposes, cross-compile it to JavaScript with Emscripten, and pray that everything works out OK. This is the same approach used by the 'pure javascript-nsf player', which cross-compiles gme and wraps it in a simple HTML5 player to render the resulting waveform. Do you think this is the right way to approach the problem? Should I instead have a go at hacking a native JavaScript NES emulator, like Ben Firshman's JSNES?
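For what it's worth, here's the driver loop I have in mind regardless of which core I end up with. callInit/callPlay/readApuState are placeholder names standing in for whatever the emulator actually exposes (they are not real gme or JSNES APIs), so treat this as a sketch of the idea, not working integration code:

```js
// Drive the NSF's PLAY routine once per frame and snapshot channel state,
// emitting an event whenever a channel's pitch or enable flag changes.
const FRAMES_PER_SECOND = 60;   // NTSC default; the NSF header can specify its own play rate
const TOTAL_SECONDS = 120;      // arbitrary cutoff for this sketch

function nsfToEvents(core, song) {
  const events = [];
  core.callInit(song);          // run the NSF INIT routine once for the chosen song
  let prev = null;
  for (let frame = 0; frame < FRAMES_PER_SECOND * TOTAL_SECONDS; frame++) {
    core.callPlay();            // run the PLAY routine for one frame
    const state = core.readApuState(); // e.g. { sq1: { timer, enabled }, sq2: {...}, ... }
    for (const [i, name] of ['sq1', 'sq2', 'tri', 'noise', 'dpcm'].entries()) {
      const now = state[name];
      const before = prev && prev[name];
      if (!before || now.timer !== before.timer || now.enabled !== before.enabled) {
        events.push({ frame, channel: i, ...now });
      }
    }
    prev = state;
  }
  return events; // still needs converting to delta-timed MIDI bytes
}
```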
Any insight into (or explanation of) why I am stupid and this will never work would also be appreciated, because you guys know much, much more about this stuff than I do. Thanks!
tl;dr: I am trying to convert NSF files to MIDI using client-side JavaScript, have a few (bad?) ideas about how to do it, and want YOUR advice.