Music Theory - Research 2018-08-31
In this post:
- A list of types.
- Some concerns I have, now that I'm looking at this with fresh eyes.
- Ideas for directions to take this, when I'm not feeling blocked by language choice.
- The plan for how to do the initial implementation.
Tiny Music is currently rather small, even for what it's meant to accomplish.
It defines several types (roughly sketched in Rust after this list):
- Duration, which is functionally a wrapper around positive rationals (which, practically speaking, could be backed by u8 components and work fine)
- NoteType, which is currently START, TIED, and REST; START will soon be swapped out for things like literal pitches and intervals applied to the preceding pitch. On reflection, the use of TIED, while probably necessary, forces some surprising behavior.
- Note, which combines Duration and NoteType, and also allows for 0 or more augmentation dots.
- BeatType, which appears to be unused. It's the start of an attempt to apply concepts from a previous entry, and consists of PRIMARY_ON_BEAT, SECONDARY_ON_BEAT, TERTIARY_ON_BEAT, PRIMARY_OFF_BEAT, and SECONDARY_OFF_BEAT.
- BeatStructure, which combines a Duration with a sequence of sequences of positive integer beat lengths. It seems like this should have provided a means of iterating out BeatTypes, but I guess I never did that.
- Measure combines a sequence of Notes with a BeatStructure, and validates at runtime that they actually match in length.
- MultiMeasure is a sequence of Measures. Along with Duration, this is going to be a newtype in Rust.
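For my own reference, here's roughly how I picture those types landing in the Rust crate. This is only a sketch: the field names, the derives, and the Result-returning constructor are guesses beyond what's described above, not existing code.

```rust
// Rough sketch only; everything beyond the descriptions in the list above is assumed.

/// Wrapper around a positive rational; u8 components would be plenty here.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct Duration {
    pub numerator: u8,
    pub denominator: u8,
}

/// What happens for a given Duration.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum NoteType {
    Start, // to be replaced by literal pitches, intervals, etc.
    Tied,
    Rest,
}

/// A Duration, a NoteType, and zero or more augmentation dots.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct Note {
    pub duration: Duration,
    pub note_type: NoteType,
    pub dots: u8,
}

/// Currently unused: where a note falls relative to the beat structure.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub enum BeatType {
    PrimaryOnBeat,
    SecondaryOnBeat,
    TertiaryOnBeat,
    PrimaryOffBeat,
    SecondaryOffBeat,
}

/// A base Duration plus nested groupings of positive beat lengths.
#[derive(Clone, PartialEq, Eq, Debug)]
pub struct BeatStructure {
    pub base: Duration,
    pub groups: Vec<Vec<u32>>,
}

/// Notes plus the BeatStructure they must exactly fill.
#[derive(Clone, PartialEq, Eq, Debug)]
pub struct Measure {
    notes: Vec<Note>,
    structure: BeatStructure,
}

impl Measure {
    /// Runtime check that the notes' total length matches the structure.
    pub fn new(notes: Vec<Note>, structure: BeatStructure) -> Result<Measure, String> {
        // The actual length comparison is elided in this sketch.
        Ok(Measure { notes, structure })
    }
}

/// Newtype over a sequence of Measures.
#[derive(Clone, PartialEq, Eq, Debug)]
pub struct MultiMeasure(pub Vec<Measure>);
```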
Something that hadn't occurred to me before is that the way I'm representing Notes means that simultaneous notes must be accomplished via simultaneous measures. In terms of what the code sees, this means that intervals need to be able to look "across" voices. In terms of what the input looks like, this means that stuff like music for stringed instruments will require some serious conversion before it's in a state that makes sense.
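As a hypothetical illustration (the function and the specific values here are made up, building on the sketch above), two simultaneous notes end up as two parallel voices, each with its own Measure over the same BeatStructure:

```rust
// Two simultaneous notes become two parallel voices; neither Measure knows
// the other exists.
fn two_voice_chord() -> Result<Vec<MultiMeasure>, String> {
    let whole = Duration { numerator: 1, denominator: 1 };
    let structure = BeatStructure { base: whole, groups: vec![vec![4]] };
    let start = Note { duration: whole, note_type: NoteType::Start, dots: 0 };

    // One Measure per voice.
    let upper = Measure::new(vec![start], structure.clone())?;
    let lower = Measure::new(vec![start], structure)?;

    // The "chord" only exists by reading both voices at the same position,
    // so interval logic has to look across MultiMeasures, not within one.
    Ok(vec![MultiMeasure(vec![upper]), MultiMeasure(vec![lower])])
}
```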
Because of the wide variety of things that can be done rhythmically, it seems like I should aim to parameterize NoteType by the type of action that can be taken. If I do that, it may then be possible to represent the same stretch of music at different levels of abstraction: convert a set of chords made up of abstract notes into a set of interlinked intervals realized through four voices, then render those as concrete pitches.
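Something like the following is what I have in mind, though `NoteType<A>` and the example action types are placeholders rather than anything decided:

```rust
// Sketch of the parameterization idea; none of these names are settled.
pub enum NoteType<A> {
    Action(A), // whatever "happens" here: a pitch, an interval, a chord symbol...
    Tied,
    Rest,
}

/// One possible concrete action: a literal pitch.
pub struct Pitch(pub u8); // e.g. a MIDI note number

/// Another: an interval applied to the preceding pitch.
pub struct IntervalFromPrevious(pub i8); // signed semitones

// The same rhythm could then exist at several levels of abstraction, e.g.
// NoteType<ChordSymbol> -> NoteType<IntervalFromPrevious> -> NoteType<Pitch>,
// with each conversion step filling in more concrete detail.
```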
To get the implementation done, I'll want to introduce the types in approximately the order they're listed above.
Next time, I start work on a tiny_music crate.