Instruments vs. effects?

This is an archive of a topic from NESdev BBS, taken in mid-October 2019 before a server upgrade.
Instruments vs. effects?
by on (#169672)
Guys, am I right to assume that in a sound engine with support for instruments there's not much need for effects?

I've looked through a few Famitracker soundtracks and some (simpler sounding) songs don't use any instruments, only effects, and others barely use any effects, relying mostly on instruments. Is there any real reason to mix both? Can you give me any examples?

I can see how having volume or pitch modifiers you can apply to each note can be useful, but other than that I don't see how having 20+ effects could make a significant difference in how a song sounds or how efficiently it's represented, especially considering how much more complex the engine has to be in order to support these effects.

This is my impression as a coder, since I'm not much of a musician, which is why I'm looking for more educated opinions. :mrgreen:
Re: Instruments vs. effects?
by on (#169673)
tokumaru wrote:
Guys, am I right to assume that in a sound engine with support for instruments there's not much need for effects?

It depends on how versatile the instruments are and how many are allowed in one project.

Quote:
I've looked through a few Famitracker soundtracks and some (simpler sounding) songs don't use any instruments, only effects, and others barely use any effects, relying mostly on instruments. Is there any real reason to mix both? Can you give me any examples?

Vibrato can be encoded as an envelope in an instrument, but portamento can't because it's different for every note. Much of the soundtrack of Haunted: Halloween '85 has volume controlled by instruments but pitch controlled by effects, especially portamento. I haven't had time to make an NSF, but an audio recording is attached to this post.

Even so, you might want, say, "loud sax" and "quiet sax" without having to double up the instrument definitions.
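
To make the vibrato-vs-portamento distinction above concrete, here is a minimal C sketch (a hypothetical data layout, not any particular engine's code): a vibrato envelope is a fixed table of offsets that works for every note, while a targeted portamento has to compare against the current period each frame and therefore needs per-note runtime state.

```c
#include <stdint.h>

/* Vibrato: a looping table of signed period offsets baked into the
   instrument.  The same table works for every note; no target to track. */
static const int8_t vibrato_env[8] = { 0, 1, 2, 1, 0, -1, -2, -1 };

int16_t apply_vibrato(int16_t base_period, uint8_t frame)
{
    return base_period + vibrato_env[frame % 8];
}

/* Portamento: slide toward a target period.  The step direction and the
   stopping point depend on where the previous note left off, so the data
   can't be pre-baked into an envelope. */
int16_t apply_portamento(int16_t current_period, int16_t target_period,
                         int16_t speed)
{
    if (current_period < target_period) {
        current_period += speed;
        if (current_period > target_period)
            current_period = target_period;
    } else if (current_period > target_period) {
        current_period -= speed;
        if (current_period < target_period)
            current_period = target_period;
    }
    return current_period;
}
```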
Re: Instruments vs. effects?
by on (#169676)
In my game I only implemented instruments, more or less. I thought it was enough; you can listen to decide: https://rainwarrior.bandcamp.com/album/my-lizard-is-the-lizard-of-soundtrack

It's this subset of Famitracker features:

1. Instruments with volume, duty, pitch, arpeggio macros.
2. Volume column.
3. Effect F0X to control tempo ("speed" only, no BPM).
4. Effect BXX to create a loop.
5. Effect D00 to change the length of a pattern.
6. No DPCM.


Of course, if there were something else I really wanted, I would have added it. Things like portamento aren't really doable as instrument macros, for example, but I simply used the pitch macros in ways that work around this, e.g. I have "bend in" and "fall off" pitch macros that don't target definite pitches but still give a similar feel to how I might use portamento in places.

So, I mean, I totally think it's enough, but I also have a programmer's perspective. I think any decent composer can deal with what you give them, really. I mean, if they can deal with NES only having 3 melodic channels, they can deal with a few more limitations.

The more common problem, I think, is feedback to the composer about the limitations and the data sizes they're producing. You should give them tools before they start composing that can validate that they're not using forbidden effects, preview the NSF, and show them how much data what they're doing produces, so they can learn to write "small". There are probably a lot of tips you can give them on the latter, too.
Re: Instruments vs. effects?
by on (#169687)
Famitracker is just a particular sound engine. Other sound engines will have their own terminology to describe what FT calls an "instrument" or "effect", or will use something that has no direct equivalent in FT.

Think of it like a scroll engine or a sprite engine - there's no "one way" to do it.
Re: Instruments vs. effects?
by on (#169691)
Bit of a side note, but I've found Famitracker terminology to be a little cryptic. I solder my own synths, and in the synth community we call what FT calls an 'effect' a control change message if digital (usually a MIDI CC), or a voltage-controlled parameter if analogue. The term 'instrument' could perhaps be understood as a 'patch'. In fact, the track could be seen as self-patching, and that would make sense, at least from my point of view.

I'm still getting to know the NES architecture, but my impression so far is that FT wasn't primarily designed with efficient hardware playback in mind. I think the GUI has some considerable workflow-related flaws, too, compared to trackers like Jeskola Buzz (not counting the modular part).

Quote:
volume controlled by instruments but pitch controlled by effects


This is what I do, too. It wasn't a conscious choice, it just felt most convenient, I guess.
Re: Instruments vs. effects?
by on (#169700)
WheelInventor wrote:
in the synth community we call what FT calls an 'effect' a control change message if digital (usually a MIDI CC), or a voltage-controlled parameter if analogue. The term 'instrument' could perhaps be understood as a 'patch'. In fact, the track could be seen as self-patching, and that would make sense, at least from my point of view.

The term "effect" is the way it was used in most trackers since they started back on the Amiga. The term "instrument" is similarly common among trackers, I think starting with FastTracker and Impulse Tracker. These really are the standard terms.

The terms you're offering come from MIDI history, kind of a completely different etymology (e.g. "patch" originally referring to modular synthesis setup). Trackers that combine stuff from the MIDI world, VST, etc. might end up conflating these sets of terms a bit, but most trackers use normal tracker terminology.

WheelInventor wrote:
my impression so far is that FT wasn't primarily designed with efficient hardware playback in mind

By "efficient hardware playback" I presume you mean suitable for use in a game, i.e. low CPU and RAM use.

Modules made in FT can be very well suited to efficient playback, but it's not limited to that. In particular being able to layer a bunch of effects simultaneously lowers performance and increases the required RAM use, but if you restrict yourself to a subset (like tokumaru is asking in this thread) it's actually very good for use in a game, particularly if you export to a smaller engine like Famitone, etc.

The NSF engine it provides isn't optimized for size or RAM use (it's designed to be flexible and useful, while being perfectly playable on hardware), which is why it's better to use your own, but the techniques and organization it uses for data are sensible, versatile, and relatively compact; very good for game use. If your performance/RAM budget allows for it, FT's NSF engine is really not all that bad, either, and it can be used directly in a game under the right circumstances. I think the engine size is about 5k and might peak at 5000 cycles or so; could be better, but isn't terrible. (A more lightweight engine might be 2k and 2000 cycles, for rough comparison.)


I've tried a ton of different chiptune trackers, and I'd actually say that Famitracker is the one that is the most suitable for practical use like this. What are you comparing it to? Similarly with regard to "usability", there's no other chiptune tracker that I think is easier to use, or more transparently related to the hardware implementation.
Re: Instruments vs. effects?
by on (#169744)
tepples wrote:
Much of the soundtrack of Haunted: Halloween '85 has volume controlled by instruments but pitch controlled by effects, especially portamento.

But in this project you were merely the coder, and had to abide by the musician's requirements, right? Would you have done anything differently if this was your own project?

Quote:
an audio recording is attached to this post.

Sounds good.

Quote:
Even so, you might want, say, "loud sax" and "quiet sax" without having to double up the instrument definitions.

This is what the volume column is for, right?

rainwarrior wrote:
I thought it was enough, you can listen to decide: https://rainwarrior.bandcamp.com/album/my-lizard-is-the-lizard-of-soundtrack

Sounds complex enough to me.

Quote:
1. Instruments with volume, duty, pitch, arpeggio macros.

Couldn't effects be implemented by applying another envelope on top of the basic instrument? That should allow you to modify some parameters without having to define entirely new (and mostly redundant) instruments... For example, I could create a few sequences for moving the pitch up or down a few notes and apply those on top of any instrument a sound channel may already be using, in order to have portamento with any instrument. That sounds versatile enough, even if not as compact as traditional effects.
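
A rough sketch of the overlay idea described above, with an assumed envelope format (the 0x7F terminator and the field names are made up for illustration; this is not Famitracker's driver): the channel keeps one position per envelope, and the instrument's and the effect's pitch contributions are simply summed each frame before writing to the APU.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical envelope format: a run of signed offsets ended by 0x7F,
   which means "hold the last value". */
#define ENV_END 0x7F

typedef struct {
    const int8_t *instr_pitch;   /* instrument's pitch envelope (may be NULL) */
    const int8_t *effect_pitch;  /* optional overlaid envelope (may be NULL)  */
    uint8_t instr_pos;
    uint8_t effect_pos;
} channel_t;

static int8_t env_read(const int8_t *env, uint8_t *pos)
{
    if (env == NULL) return 0;
    int8_t v = env[*pos];
    if (v == ENV_END)                       /* hold: stay on the last value */
        return *pos ? env[*pos - 1] : 0;
    (*pos)++;
    return v;
}

/* Per frame: base note period plus both envelope contributions. */
int16_t channel_period(channel_t *ch, int16_t note_period)
{
    int16_t p = note_period;
    p += env_read(ch->instr_pitch,  &ch->instr_pos);
    p += env_read(ch->effect_pitch, &ch->effect_pos);
    return p;
}
```

The extra RAM cost per channel is just the second pointer and position, which is the trade-off discussed later in the thread.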

Quote:
The more common problem, I think, is feedback to the composer about the limitations and the data sizes they're producing.

Did you have to go through this or did you compose the entire Lizard soundtrack yourself?

Bregalad wrote:
Think of it like a scroll engine or a sprite engine - there's no "one way" to do it.

Sure, but trackers seem to be very popular, so I imagine most sound engines are influenced by them. I don't really care about the terminology, only about what it takes for an engine to be considered versatile enough to the average composer.
Re: Instruments vs. effects?
by on (#169747)
tokumaru wrote:
tepples wrote:
Much of the soundtrack of Haunted: Halloween '85 has volume controlled by instruments but pitch controlled by effects, especially portamento.

But in this project you were merely the coder, and had to abide by the musician's requirements, right? Would you have done anything differently if this was your own project?

I was the coder for the music, and basically was given a completed soundtrack and had to port a bunch of features from Famitracker to the Famitone engine they were using.

tokumaru wrote:
Couldn't effects be implemented by applying another envelope on top of the basic instrument? That should allow you to modify some parameters without having to define entirely new (and mostly redundant) instruments... For example, I could create a few sequences for moving the pitch up or down a few notes and apply those on top of any instrument a sound channel may already be using, in order to have portamento with any instrument. That sounds versatile enough, even if not as compact as traditional effects.

Only some of Famitracker's effects are equivalent to instrument macros (e.g. vibrato, or fixed duty). Some aren't, like targeted pitch slides, which have to stop based on a comparison with the current pitch.

In a case like vibrato, putting an envelope on top of another envelope is equivalent to implementing the effect, as far as I see it. I don't think you're saving much by trying to think of an effect as another envelope, if anything. I'd guess it would be about the same CPU cost as FT's existing vibrato, and take up more ROM space.

In the "envelopes only" case, I would just have multiple instruments, one with a vibrato pitch macro, one with a flat pitch macro, and the other macros are shared data. An instrument definition by itself is small (a couple of bytes), and turning vibrato on and off is just a matter of switching instruments.

In mine specifically: an "instrument" is 4 bytes, and a "macro" varies but the average is 15 bytes, and macros can be shared across many instruments. I had about 3000 bytes of macros, and 700 bytes of instruments in my soundtrack.
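
For illustration only (the post doesn't spell out Lizard's exact byte format), a 4-byte instrument could simply be four one-byte indices into shared macro tables, which is how many instruments can reference the same envelope data:

```c
#include <stdint.h>

/* Hypothetical 4-byte instrument: one index per envelope type into a shared
   table of macros, so an instrument that reuses existing macros costs only
   these four bytes. */
typedef struct {
    uint8_t volume_macro;
    uint8_t pitch_macro;
    uint8_t arpeggio_macro;
    uint8_t duty_macro;
} instrument_t;

/* Two pitch macros sharing a made-up 0x7F "hold last value" terminator.
   (Volume/arpeggio/duty tables omitted for brevity.) */
static const int8_t flat_pitch[]    = { 0, 0x7F };
static const int8_t vibrato_pitch[] = { 0, 1, 2, 1, 0, -1, -2, -1, 0x7F };

static const int8_t *const pitch_macros[] = { flat_pitch, vibrato_pitch };

/* "Lead" and "lead with vibrato" differ in a single index; the other three
   macros are shared data, as described in the post. */
static const instrument_t lead         = { 0, 0, 0, 0 };
static const instrument_t lead_vibrato = { 0, 1, 0, 0 };
```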

tokumaru wrote:
rainwarrior wrote:
The more common problem, I think, is feedback to the composer about the limitations and the data sizes they're producing.

Did you have to go through this or did you compose the entire Lizard soundtrack yourself?

Lizard is entirely mine, but I've gone through this kind of problem on a number of projects (not limited to NES, or music), one example of which was Haunted Halloween, as mentioned above.

If they do too much work in the dark, it's just not going to work when you turn the lights on. If you can give them the restrictions up front, but even better give them a tool they can use to validate the restrictions for themselves, you'll have fewer problems getting it online. I think most composers are quite willing to put up with a few arbitrary restrictions to make the implementation more efficient, but they're not happy to have to rewrite stuff they didn't realize wasn't going to work (and/or you won't be happy having to implement a bunch of extra features that could have been avoided).

Not intending to put anyone at fault on the HH project. I don't know how it ended up in that position, or how it was managed in general; I was just hired to do my part to make it work near the end of the project. I'm merely suggesting that extra limitations like "no effects" are normal rules for any game project, and artists/composers should be able to cope with them-- but only if they know about them! (...and how to validate!)
Re: Instruments vs. effects?
by on (#169757)
tokumaru wrote:
tepples wrote:
Much of the soundtrack of Haunted: Halloween '85 has volume controlled by instruments but pitch controlled by effects, especially portamento.

But in this project you were merely the coder, and had to abide by the musician's requirements, right? Would you have done anything differently if this was your own project?

My projects use my own music engine called Pently. I just like how it handles triangle+noise drums and loops that are shorter than a pattern. In fact, I covered most of the game's soundtrack in Pently before the musician became frustrated with the lack of portamento, despite my use of "bend in" and "fall off" instruments to make a mostly musically equivalent sequence. (Comparison)

Quote:
Quote:
Even so, you might want, say, "loud sax" and "quiet sax" without having to double up the instrument definitions.

This is what the volume column is for, right?

Some music engines don't have a volume column for the same reason they don't have effects: a score is expected to bake it into the instruments.

rainwarrior wrote:
If you can give them the restrictions up front, but even better give them a tool they can use to validate the restrictions for themselves, you'll have fewer problems getting it online.

I learned that over the course of Haunted: Halloween '85. The level background graphics provided by the artist were hitting lots of too-many-tiles errors and attribute clash. So I took a few hours to wrap the background converter in a shell that takes an INI file listing PNG files and 32-hex-digit palette strings and write a bunch more diagnostics. As for the game's music, I made my covers along with the relevant levels, but a miscommunication between my manager and the composer made it sound like Pently itself would be extended to handle new effects.

And yes, I now plan to make a FamiTracker-to-Pently converter in case the user wants to use FamiTracker's UI (rather than Pently's native MML- and LilyPond-inspired input format) and validate a score against Pently's limits. I just couldn't do so at the time because my time was occupied with parts of the game other than music, and I can't do so now because my time is occupied with the sequel.
Re: Instruments vs. effects?
by on (#169797)
I tend to use a mixture of effects and instruments. Like others have said, some effects are difficult to do with just instruments (like portamento) and some are just tedious (making a new vibrato macro for every few notes depending on pitch, making new instruments for every voice/duty cycle change, etc). Though all effect commands and no instruments is just as problematic - sometimes you have a lead using a specific sound and it is tedious to have to copy/paste the duty cycle, volume, and whatever other effects manually each time.

I've written a few songs that show both extremes:

Few effect commands:
https://www.youtube.com/watch?v=WluHxbGQkBk

Only effect commands (no instruments, note that this gets super cluttered around 2:55):
https://www.youtube.com/watch?v=D6EDwXayQY4

One instrument and no effect commands (except volume):
https://www.youtube.com/watch?v=bDoIIUBmGg4
Re: Instruments vs. effects?
by on (#169802)
Quote:
Sure, but trackers seem to be very popular, so I imagine most sound engines are influenced by them. I don't really care about the terminology, only about what it takes for an engine to be considered versatile enough to the average composer.

Trackers don't seem very popular in Japan, and that's where most games were developed. The sound engines from games I've reversed were not influenced by trackers.
Re: Instruments vs. effects?
by on (#169804)
What was popular in Japan? MML-type things that take note names and produce hex bytecodes for the music engine? Or just entering the bytecodes manually?
Re: Instruments vs. effects?
by on (#169815)
I know a lot of people from Japan who use Famitracker or other trackers. PPMCK/MML was developed there, and was documented only in Japanese for a long time, so it has a persistent regional association. It's the same deal with Famitracker: developed in Sweden, mainly in English, it has an English-speaking regional association, but as it's become more popular it has spread to Japan quite a bit.

I don't think Japan is tracker-averse in any particular way, it's just that for NES homebrew music, PPMCK/MML was the first really major thing on the scene.

If you're not talking about modern homebrew music, then it's a whole different kettle of fish. Doommaster1994 did some surveying of that a while back:
Forum topic 8016: How NES Music Was REALLY Composed
Re: Instruments vs. effects?
by on (#169824)
Quote:
Modules made in FT can be very well suited to efficient playback, but it's not limited to that.


I see. I was somehow getting the impression that, since I could do near-perfect covers of many different games with supposedly different native game engines, an engine like that, which is able to 'do it all', would be bloated. But if the performance cost can be held down by restrictive use, then that's indeed very reasonable.

Regarding comparison to other trackers, my main reference is the aforementioned Jeskola Buzz, which is the one I've stuck with since 2004 or so for when other DAW software and/or sequencers don't cut it, mainly because the workflow and layout are extremely good, especially in the traditional tracker part of it. Of course, it's no good for NES music. But I miss how fast and cleanly you can edit your song, frame, or loop during playback, compared to FT.

Regarding the file format, do Famitone or any of the other engines that have been released to the community assume that, unless an instruction states otherwise, the track or frame stays the same? (For example, track 3 played frame 5, so it will play frame 5 at the next frame switch, too, unless there's new data.) Or is it more like 'do this for this long'? This may not be the greatest issue, but documentation like that might be handy when composing.
Re: Instruments vs. effects?
by on (#169826)
Yeah, I consider Jeskola Buzz a very unusual tracker. It's definitely geared toward modular synthesis and MIDI-style interfaces. I think its way of handling patterns is a little abnormal too, but it works fine.

Famitracker is unusual in its own ways too, but I find it works really well once you're oriented. I really do think it's the best chiptune tracker I've used; there are a lot of others at this point.


Famitracker has an "order" specified per channel. By order, I mean the list of "patterns" to play (by number) that makes up the song. If, for example, you want the noise channel to play a repeating drum pattern, you can enter the same pattern number multiple times in the order.
Re: Instruments vs. effects?
by on (#169828)
WheelInventor wrote:
Regarding the file format, do Famitone or any of the other engines that have been released to the community assume that, unless an instruction states otherwise, the track or frame stays the same? (For example, track 3 played frame 5, so it will play frame 5 at the next frame switch, too, unless there's new data.) Or is it more like 'do this for this long'?

NerdTracker II stores the order list as 5-tuples: (pattern for pulse 1, pattern for pulse 2, pattern for triangle, pattern for noise, pattern for DMC). So does FamiTracker, and I assume that engines based on FamiTracker text export do likewise. Thus if two consecutive frames play pattern 5 on track 3, the 5 will be stored twice.

Pently operates differently, not being quite as closely tied to tracker heritage. The order list is a list of "at row X after song start, start playing and looping pattern Y" commands. Individual patterns can have different loop lengths, making an ostinato or short drum loop efficient. A pattern can be interrupted by the start of another pattern or by a stop command, making drum fills efficient. Patterns can also be reused and transposed across channels with different default instruments, making parallel motion of two channels efficient. And because pattern start commands have row granularity, not frame switch granularity, 2-channel echo is also efficient. But the price it pays for this flexibility is that frame switch commands take more bytes.
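
To make the two layouts concrete, here is a hypothetical C encoding (illustrative only, not NerdTracker's, FamiTracker's, or Pently's real byte format): the tracker-style order stores one pattern number per channel per frame, so a repeated drum pattern repeats its number, while a Pently-style list stores (row, pattern) start commands with row granularity.

```c
#include <stdint.h>

/* Tracker-style order: one row of the table per frame, one pattern number
   per channel.  Repeating pattern 5 on the noise channel stores the 5 twice. */
enum { PULSE1, PULSE2, TRIANGLE, NOISE, DMC, NUM_CHANNELS };

static const uint8_t order[][NUM_CHANNELS] = {
    { 0, 1, 2, 5, 7 },   /* frame 0 */
    { 0, 1, 3, 5, 7 },   /* frame 1: noise repeats pattern 5 */
};

/* Pently-style order (conceptually): "at row R, start looping pattern P on
   channel C".  A short drum loop needs only one entry, but each entry is
   bigger than a plain pattern number. */
typedef struct {
    uint16_t start_row;   /* rows after song start */
    uint8_t  pattern;
    uint8_t  channel;
} pattern_start_t;

static const pattern_start_t song[] = {
    { 0,  5, NOISE  },   /* drum loop keeps repeating until interrupted */
    { 0,  0, PULSE1 },
    { 64, 1, PULSE1 },
};
```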
Re: Instruments vs. effects?
by on (#169837)
rainwarrior wrote:
I was the coder for the music

I didn't know that!

Quote:
Only some of Famitracker's effects are equivalent to instrument macros (e.g. vibrato, or fixed duty). Some aren't, like targeted pitch slides, which have to stop based on a comparison with the current pitch.

Couldn't I just write some slide macros with the few combinations of offsets and speeds I need? I know that's not nearly as efficient space-wise, but as long as I don't need many combinations of offsets and speeds, I think this could work.

Quote:
I don't think you're saving much by trying to think of an effect as another envelope, if anything.

The playback code is simpler, I guess, since the code for processing envelopes is already there for instruments, and could be reused for these effects.

Quote:
I'd guess it would be about the same CPU cost as FT's existing vibrato, and take up more ROM space.

Yes, songs may take more space, depending on how varied the effects are.

Quote:
In mine specifically: an "instrument" is 4 bytes, and a "macro" varies but the average is 15 bytes, and macros can be shared across many instruments.

How exactly do 4-byte instruments work? Can an instrument do anything without a macro? The "macros" you're talking about are what Famitracker calls "sequences"?

tepples wrote:

I do like your version much better... it's cleaner and more fluid.

RushJet1 wrote:

This one is awesome!

Quote:
Only effect commands (no instruments, note that this gets super cluttered around 2:55):
https://www.youtube.com/watch?v=D6EDwXayQY4

This... is pretty insane!

Quote:
One instrument and no effect commands (except volume):
https://www.youtube.com/watch?v=bDoIIUBmGg4

Sounds pretty good, especially considering the harsh restrictions!
Re: Instruments vs. effects?
by on (#169839)
tokumaru wrote:
How exactly do 4-byte instruments work? Can an instrument do anything without a macro? The "macros" you're talking about are what Famitracker calls "sequences"?

I guess "macro" was a term picked up from PPMCK/MML, but yes it is synonymous with "sequence" in Famitracker. I also often call them envelopes. Sorry if this is confusing. ;)

Yes, an instrument is just a selection of macros, one for each of the types of envelope you can assign. (Famitracker actually has 5 envelope types, but "hi pitch" is not very useful; other data can be associated with an instrument, too, but it's not very essential.)

tokumaru wrote:
Couldn't I just write some slide macros with the few combinations of offsets and speeds I need? I know that's not nearly as efficient space-wise, but as long as I don't need many combinations of offsets and speeds, I think this could work.

Depends on how much you want to use it. A few real problems:
1. Every interval is a different distance for each starting pitch, so if you use an envelope to make a slide from C-3 to E-3, you can't really use it for any other pair of pitches.
2. Determining the precise values needed for a targeted slide envelope is tedious.
3. Slow pitch slides require very large envelopes.
4. The pitch slide effects all have configurable speed; each envelope only supports one speed.

If you want to do it rarely, sure, it's fine to create a couple of special instruments for specific slides. If you want to do it often, no, it's not really feasible without dedicated code.

Again, though, if you don't have this feature you just compose around it. You can still do plenty of pitch effects, it's just difficult to do a targeted slide from one precise pitch to another without implementing the effect.

There are other ways to handle pitch, too. You could create an expanded pitch table with 16 divisions of each semitone (like how MODs do fine pitch), allowing pitch envelopes to apply logarithmically so you could make a "5 semitone slide" envelope that could be used on any note of the scale. This is feasible from an engine perspective, but it's not compatible with Famitracker's linear model of pitch.
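
A sketch of that expanded-table idea under assumed NTSC constants (a generic fine-pitch table, not Famitracker's linear-pitch implementation): with a period entry for every 1/16 of a semitone, a pitch envelope value of +/-16 always means exactly one semitone, so a "5 semitone slide" envelope works from any starting note.

```c
#include <stdio.h>
#include <math.h>
#include <stdint.h>

#define STEPS_PER_SEMITONE 16
#define NUM_SEMITONES      96          /* about 8 octaves, roughly NES range */
#define NTSC_CPU_HZ        1789773.0

/* APU pulse period register value for a given frequency:
   f = CPU / (16 * (t + 1))  =>  t = CPU / (16 * f) - 1 */
static uint16_t period_for_freq(double hz)
{
    return (uint16_t)(NTSC_CPU_HZ / (16.0 * hz) - 1.0 + 0.5);
}

int main(void)
{
    static uint16_t fine_period[NUM_SEMITONES * STEPS_PER_SEMITONE];
    int i;

    for (i = 0; i < NUM_SEMITONES * STEPS_PER_SEMITONE; i++) {
        /* index 0 = A1 (55 Hz); each step is 1/16 of a semitone up. */
        double semitones = (double)i / STEPS_PER_SEMITONE;
        double hz = 55.0 * pow(2.0, semitones / 12.0);
        fine_period[i] = period_for_freq(hz);
    }

    /* A pitch envelope value is now an index offset: +16 is always exactly
       one semitone, no matter which note the channel started from. */
    printf("A1: period %u, one semitone up: %u\n",
           (unsigned)fine_period[0], (unsigned)fine_period[STEPS_PER_SEMITONE]);
    return 0;
}
```

One trade-off worth noting: at this size the table is about 3 KB (1536 two-byte entries), so a real engine would likely cover fewer octaves or fewer subdivisions.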

tokumaru wrote:
The playback code is simpler, I guess, since the code for processing envelopes is already there for instruments, and could be reused for these effects.

I think the biggest problem with effects is that each class of related effects requires RAM to maintain its state; this is true whether or not you try to implement it as an envelope. The actual code complexity of an individual effect tends to be pretty small; the inefficiency is usually just due to having to implement a collection of them. Each effect is one more bit of state you have to track and update every frame, multiplied by the number of channels that effect may apply to.

I was saying that it's true that a few of the effects could in theory be implemented by reusing other envelope code, but in general I think the extra envelope data would end up being bigger than the code you saved, and RAM usage would be the same or worse. CPU for any given effect is probably similar or better than an envelope. In some cases it could just be an approximation of the effect too. I understand the train of thought but I don't think there's much to gain here. The complexity of the effect code is not really the problem with effects, and probably not worth trying to "simplify" by replacing with extra envelopes.
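
To illustrate the RAM point with a made-up per-channel state block (not FT's actual RAM map): each family of effects adds fields that must exist for every channel and be ticked every frame, even in songs that never use them.

```c
#include <stdint.h>

/* Hypothetical per-channel state if several FT-style effects are supported.
   Multiply by the number of channels; it all has to live in RAM and be
   updated every frame, whether or not the current song uses the effect. */
typedef struct {
    /* always needed */
    uint8_t  note;
    uint8_t  instrument;
    uint8_t  volume;

    /* 4xy vibrato */
    uint8_t  vibrato_speed, vibrato_depth, vibrato_phase;

    /* 1xx/2xx/3xx pitch slides */
    int16_t  pitch_accum;        /* current accumulated slide offset */
    uint8_t  slide_speed;
    uint16_t slide_target;       /* for targeted (3xx) slides */

    /* Qxy/Rxy note slides, 0xy arpeggio, Axy volume slide, ...
       each family would add more fields here. */
} channel_state_t;
```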


The thing is, though, the set of 4 base envelopes that an instrument has (volume, pitch, arpeggio, duty) is very versatile as-is, easy to use, and they can all operate with the same code. As I said, you can just make vibrato as an envelope that's part of an instrument; you don't require an effect feature to add it. There's a lot of functional redundancy in what Famitracker's effects do; they're just there to support different workflows and make it easier to produce music. This is why my answer to your OP question was that you don't really need effects; you've got tons of functionality already with just the 4 envelopes.

You could do something like let a vibrato effect replace an instrument's pitch envelope if it's blank, rather than acting as an overlaid thing. That way the effect isn't actually doing more runtime work, just a 1-time setup thing, but if you do stuff like that it starts to get really divergent from how something works/sounds in Famitracker. I dunno how much you want to go down the road of either modifying Famitracker, or giving your composer features they can only hear if they do an export and run in an emulator, but in general I'd avoid this.

My approach was just to pick a flexible/versatile subset of features from Famitracker; I didn't really want to do stuff that Famitracker couldn't, even if that seemed convenient, because I wanted it to sound exactly the same in the tool as in the game, and to have no confusion about what's going to happen. Anything that's not supported throws an error during export, and everything that is supported should work identically, as much as it can. (I did make a few subtle differences.)
Re: Instruments vs. effects?
by on (#169843)
rainwarrior wrote:
Sorry if this is confusing. ;)

That's OK, thanks for clearing that up.

Quote:
1. Every interval is a different distance for each starting pitch, so if you use an envelope to make a slide from C-3 to E-3, you can't really use it for any other pair of pitches.

Well, I do plan on using tables to create a fixed number of steps between notes, which will allow me to address them linearly... That should take care of this problem, right?

Quote:
2. Determining the precise values needed for a targeted slide envelope is tedious.

This too shouldn't be such a big problem if notes can be addressed linearly, I guess.

Quote:
3. Slow pitch slides require very large envelopes.

That's true.

Quote:
4. The pitch slide effects all have configurable speed; each envelope only supports one speed.

Again, true.

Quote:
If you want to do it rarely, sure

That will probably be the case.

Quote:
There are other ways to handle pitch, too. You could create an expanded pitch table with 16 divisions of each semitone (like how MODs do fine pitch), allowing pitch envelopes to apply logarithmically so you could make a "5 semitone slide" envelope that could be used on any note of the scale.

Yes, that's exactly what I plan to do (I'm replying as I read, so I mentioned it before knowing you'd suggest it).

Quote:
I think the biggest problem with effects is that each class of related effects requires RAM to maintain its state;

Yeah, I also thought about this after my last post. If I can have 2 or 3 effects per channel, that could add up to a significant amount of RAM, and I really am trying to keep RAM requirements low.

Quote:
this is true whether or not you try to implement it as an envelope.

Well, I'm not so sure about that. For envelopes, all I have is a pointer to the current envelope step. If I add a second envelope I'll just need another pointer. My idea is to get the note from the song and the channel's volume, apply all the instrument modifiers, and on top of that apply all the effect modifiers, and then write the final result to the APU. I don't really need any state because each step of an envelope has all the info I need, so I can construct the final APU data from scratch every frame.
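
Here is a rough sketch of the "rebuild everything each frame" approach described above (hypothetical packed-step format and helper names, not any engine's source): the persistent per-channel state is just the note, its volume, and one pointer per active envelope; the register values are derived from those every frame instead of being updated incrementally.

```c
#include <stdint.h>

/* Hypothetical packed envelope step: volume, duty, and a pitch offset all in
   one entry, so a single pointer per envelope is the whole state. */
typedef struct {
    uint8_t volume;     /* 0-15 */
    uint8_t duty;       /* 0-3  */
    int8_t  pitch;      /* signed period offset */
} env_step_t;

typedef struct {
    uint8_t  note;            /* from the song data */
    uint8_t  note_volume;     /* from the song's volume column, 0-15 */
    const env_step_t *instr;  /* instrument envelope, current step */
    const env_step_t *effect; /* optional overlay, current step (may be NULL) */
} channel_t;

extern const uint16_t period_table[];   /* note -> APU period, defined elsewhere */

/* Everything below is derived; nothing here needs to persist across frames. */
void update_channel(const channel_t *ch, uint8_t *out_vol_duty,
                    uint16_t *out_period)
{
    uint8_t  vol    = (uint8_t)((ch->note_volume * ch->instr->volume) / 15);
    uint8_t  duty   = ch->instr->duty;
    uint16_t period = period_table[ch->note] + ch->instr->pitch;

    if (ch->effect) {
        vol    = (uint8_t)((vol * ch->effect->volume) / 15);
        period += ch->effect->pitch;
    }

    /* $4000-style byte: duty in bits 6-7, constant volume + length halt. */
    *out_vol_duty = (uint8_t)((duty << 6) | 0x30 | vol);
    *out_period   = period;
}
```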

Quote:
I was saying that it's true that a few of the effects could in theory be implemented by reusing other envelope code, but in general I think the extra envelope data would end up being bigger than the code you saved, and RAM usage would be the same or worse.

I agree about the data being bigger, but I guess my approach here is like "I don't want to make the engine overly complex, and I probably won't need effects very often, but I also don't want to completely get rid of them", so this is a quick and simple way I found to allow effects if they are really necessary, but when I use them I'll know that the price is precious ROM space. If I don't use them, I don't lose anything.

As for RAM usage, 2 bytes for an extra pointer per channel sounds MUCH better than whatever I'd need for the states of combined effects (I'm guessing it would be at least 2 bytes per effect, but maybe some need more?). Unless I'm missing something here, but I'm fairly sure I can create the APU data from scratch every frame like I described above.

Quote:
This is why my answer to your OP question was that you don't really need effects; you've got tons of functionality already with just the 4 envelopes.

And I've been listening to some good examples of that.

Quote:
I dunno how much you want to go down the road of either modifying Famitracker, or giving your composer features they can only hear if they do an export and run in an emulator, but in general I'd avoid this.

I want to keep things simple on all fronts. :wink: I'm not sure Famitracker will be involved at all, though.

Please note that I'm not trying to defend my choices or anything like that, I'm honestly looking for some good feedback so I'll end up with a sound engine that isn't too crippled but isn't so heavyweight either, and so far the replies have been very useful. If you think that things I say are stupid, or if you disagree with me on anything, please keep the replies coming. If you can validate my ideas too, that'd be great, so I know I'm not going forward with something completely weird. Audio is the thing I'm least experienced with when it comes to game programming, so I really need to be told that I'm going in the wrong direction if that's the case. Hopefully that'll make me better at this.

EDIT: I just realized I completely neglected to mention that the RAM requirement I mentioned (2 bytes per envelope) is only possible if the different types of envelopes aren't in separate reusable sequences, but instead the pitch, duty and volume sequences are all packed together in the same sequence. I do realize that, again, this needs more ROM than separate combinable sequences each with their own loop points. This is not final though.
Re: Instruments vs. effects?
by on (#169914)
Lots of engines combine volume + duty into a single macro (for obvious reasons). In Famitracker arpeggio and pitch macros are kind of exclusive anyway, so you could combine them as well and have some sort of mode flag to control it. Again, if you do this kind of thing, you're divergent from Famitracker, but that's your call. This would save RAM and CPU and instrument size all at once.
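
One possible packing, purely to show what "combine volume + duty into one macro" might look like at the byte level (made up for illustration, not Famitracker's or Famitone's format): volume in the low nibble and duty in the top two bits line up with the pulse channel's $4000 register, and a per-instrument flag selects whether the second macro is read as pitch or as arpeggio.

```c
#include <stdint.h>

/* One byte per step: DD-- VVVV, matching the pulse channel's $4000 layout,
   so a step can be OR'd with the length-halt/constant-volume bits and
   written out directly. */
#define STEP(duty, vol)  ((uint8_t)(((duty) << 6) | ((vol) & 0x0F)))

static const uint8_t piano_volduty[] = {
    STEP(2, 15), STEP(2, 12), STEP(1, 9), STEP(1, 6), STEP(0, 3), STEP(0, 1),
};

/* Instrument flag choosing how the second macro is interpreted. */
enum { MACRO_PITCH, MACRO_ARPEGGIO };

typedef struct {
    const uint8_t *volduty;       /* combined volume + duty steps */
    const int8_t  *pitch_or_arp;  /* one macro, meaning selected by mode */
    uint8_t        mode;          /* MACRO_PITCH or MACRO_ARPEGGIO */
} instrument_t;
```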

Famitone, for example, forbids duty envelopes and just keeps a single duty state per instrument instead, I think. For HH, I believe I added duty envelope functionality by combining it with volume, because it was sufficient for the music I had to work with. (Combining them puts restrictions on looped envelopes, sometimes you can't sensibly combine them.)

Famitone also "simplifies" pitch macros by removing their ability to accumulate. i.e. in FT you can slide pitch down with a 1-byte looping envelope of "1", but in Famitone you can't loop like that, and I think it stops at 127, or something, so it becomes useless for long falloff slides. He did this presumably because he thought it was simpler/smaller to implement a single byte pitch offset than a cumulative pitch state, but it's one of many decisions that results in a significant loss of functionality from the FT counterpart.
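
The difference in a small sketch (made-up code paraphrasing the behaviour described above, not either engine's source): with an accumulating envelope, a looping one-byte envelope of 1 keeps sliding forever, while with an absolute offset the pitch never moves further than the largest value stored in the envelope.

```c
#include <stdint.h>

static const int8_t slide_env[1] = { 1 };   /* 1-byte looping envelope */
#define ENV_LEN 1

/* FT-style: the envelope value is added to a running total every frame,
   so this tiny envelope produces an unbounded slide. */
int16_t pitch_accumulating(int16_t accum, uint8_t pos)
{
    return accum + slide_env[pos % ENV_LEN];
}

/* Famitone-style (as described above): the envelope value IS the offset,
   so looping over { 1 } just holds the pitch one unit away from the note. */
int16_t pitch_absolute(int16_t base_period, uint8_t pos)
{
    return base_period + slide_env[pos % ENV_LEN];
}
```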

tokumaru wrote:
Well, I'm not so sure about that. For envelopes, all I have is a pointer to the current envelope step. If I add a second envelope I'll just need another pointer. My idea is to get the note from the song and the channel's volume, apply all the instrument modifiers, and on top of that apply all the effect modifiers, and then write the final result to the APU. I don't really need any state because each step of an envelope has all the info I need, so I can construct the final APU data from scratch every frame.

No, you need 2 bytes of state for your envelope. Any effect that can be sufficiently implemented as an envelope has no more than 2 bytes of state. (Per channel.) Take a look at FT's NSF Driver source if you want to see how it organizes RAM, and how small individual effect implementations are.

The other thing is all the effects that can be implemented as an extra envelope are already kinda doable as instrument envelopes anyway. They're a redundancy. The effects that can't be envelopes are the ones that matter more, functionally.

The suggestion that effects could be implemented as more envelopes to me is analogous to suggesting that addition could be implemented as a lookup table instead. Sure it could, but the reasons why you wouldn't are very similar.

Sorry, I should just drop the argument. I think if you start to look into what specific effects do you'll understand why it's not really worthwhile. If you really want to simplify something, just don't implement them at all, or don't implement them as overlaid effects, or make them temporarily replace instrument envelopes instead, etc.

tokumaru wrote:
I want to keep things simple on all fronts. :wink: I'm not sure Famitracker will be involved at all, though.

The reason I intentionally maintained a strict subset of Famitracker, instead of making convenient/interesting implementation decisions that diverge from it, is that Famitracker is a great composing tool. The kind of feedback and quick iteration you get from it is fantastic. It's easier to use and easier to write better music with than, for example, PPMCK/MML's compile-and-check process. If you make something that works with FT, you gain all the power of that tool, and it has a large community of skilled users who already know how to work with it.

There's lots of ways you can implement a music engine for NES, and there are a bunch of possibilities that I think would be interesting for an engine to have (e.g. the logarithmic pitch tables I suggested earlier), but personally I would only pursue them for a one song demo, or some proof of concept, where the amount of music work needed is more limited.

In a commercial project, if you can make good music faster with one tool than another, you should use it. If I was paying someone by the hour, I definitely wouldn't give them PPMCK/MML instead of FT to work with, if you understand what I'm saying, but even in a non-commercial project your time is still limited by fatigue and availability. If you can iterate faster, you can make better music with the time you've got.

Whatever you wanna use is up to you, but this is the reason why I stick with FT. There's lots of things about FT that I would/could have done differently if I was starting from scratch, but I'd rather not start from scratch if my real goal is just to make a good game soundtrack I can use in my game.
Re: Instruments vs. effects?
by on (#169930)
Say there's an MML-style engine that produces smaller bytecode than FamiTone, let's say because of more efficient looping and transposition. And further say someone's composing for an NROM or CNROM game. Would the following process be practical?

  1. Compose in FT
  2. Convert FT text to MML once using engine's conversion tool
  3. Optimize MML by hand
  4. Compile-and-check optimized MML
Re: Instruments vs. effects?
by on (#169931)
I can't seriously entertain a hypothetical question like that. There's nothing concrete to make a decision about. How many hand optimizations are you making in step 3? How many would be too much? How much ROM space do you need to save? Get a real project, and you'll know the answer.

Practical is anything that works when you're actually making something. Jeroen Tel got it done by typing hex values into a text file.

My answers above are an explanation of my own motivations for the decisions I've made, not laws of practicality.
Re: Instruments vs. effects?
by on (#169936)
rainwarrior wrote:
Lots of engines combine volume + duty into a single macro (for obvious reasons). In Famitracker arpeggio and pitch macros are kind of exclusive anyway, so you could combine them as well and have some sort of mode flag to control it. Again, if you do this kind of thing, you're divergent from Famitracker, but that's your call. This would save RAM and CPU and instrument size all at once.

It's good to know I'm not doing something crazy in this respect.

Quote:
(Combining them puts restrictions on looped envelopes, sometimes you can't sensibly combine them.)

Yeah, I get that.

Quote:
Famitone also "simplifies" pitch macros by removing their ability to accumulate. i.e. in FT you can slide pitch down with a 1-byte looping envelope of "1", but in Famitone you can't loop like that, and I think it stops at 127, or something, so it becomes useless for long falloff slides.

I see... I wouldn't support this either, since it'd need more state (a dynamic pitch).

Quote:
No, you need 2 bytes of state for your envelope.

True, but that's just wording it differently. :wink:

Quote:
Any effect that can be sufficiently implemented as an envelope has no more than 2 bytes of state. (Per channel.)

But each effect only affects one parameter, while an envelope can deal with duty, volume and pitch all at once.

Quote:
Take a look at FT's NSF Driver source if you want to see how it organizes RAM, and how small individual effect implementations are.

Will do, but I'm already quite set on not implementing effects the way FT does. Whether I'll do it in some other way is another issue.

Quote:
The other thing is all the effects that can be implemented as an extra envelope are already kinda doable as instrument envelopes anyway. They're a redundancy.

I agree, I'm just hoping I can save a little space by not duplicating entire instruments, when a little modifier would do. I honestly don't know how often these situations will come up, though.

Quote:
The effects that can't be envelopes are the ones that matter more, functionally.

Guess I'll probably have to live without them.

Quote:
The suggestion that effects could be implemented as more envelopes to me is analogous to suggesting that addition could be implemented as a lookup table instead.

I actually do that too!

Quote:
Sorry, I should just drop the argument. I think if you start to look into what specific effects do you'll understand why it's not really worthwhile.

I'm seeing this almost as the equivalent of not implementing anything at all. If I don't ever use these hacked-in effects, all I'm losing is 2 bytes of RAM pet channel, which I can claim back at a later time if necessary. BTW, I'm not arguing. :)