A friend of mine recently contacted me about a (very part-time) game project he's heading up, titled until further notice "Regime." Set in a dystopian future civilization, Regime centers on the plight of citizens rebelling against the ever-tightening grip of the ruling government. The unusual thing about this particular project is that the game itself is broken up into several sub-games: a farming simulation a la Harvest Moon, a story-driven investigative adventure, and a tactical turn-based squad combat game, among others.
At the very least, I'll be providing a soundtrack for the project, and I'll likely contribute on a few other fronts as well, perhaps level design or programming. Unfortunately, as everyone involved is employed full-time, the project won't see the light of day for the next couple of years. However, prototyping is currently underway and I have a few music concepts already:
Regime's multiple game types are both a blessing and a curse from a composition perspective - I can take liberties in exploring a variety of musical styles, but I still need a way to tie it all together. Eventually I'll have to settle on a distinct voice for the game but, in this early pre-production stage, I'm not too worried about that. It's nice to, for once, have time to write different variations on themes and experiment with styles and instrumentation.
GENERATIVE MUSIC/ARTIFICIAL LIFE SIMULATION
I recently discovered some intriguing generative music projects over at a website called Earslap. While generative techniques are definitely not something I'd use for the purpose of creating music in the traditional sense of the word, they can be used to enhance already interesting audio/visual experiences. Rez is an excellent example of this and, to a lesser extent, the more recent Bit.Trip games (it's arguably most effective in Beat).
For some time now I've wanted to create some sort of AI sandbox to test things like steering and flocking, and to see what sort of interesting emergent behavior I can create with only a few different agents. It recently occurred to me that I could make such a thing more engaging by adding musical noises: perhaps different lifeforms could emit different pitches in a scale, and parents could recognize their offspring by the sound they make, provided they're within earshot.
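To give a rough sense of what I mean, here's a minimal sketch of the recognition idea. Everything here is hypothetical (the pentatonic scale, the earshot radius, the `Agent`/`recognizes` names are all just placeholders for illustration), but it shows the core rule: offspring inherit their parent's pitch, and a parent "hears" them only when the pitch matches and they're close enough.

```python
import math
import random

# Hypothetical scale for the agents' voices (MIDI note numbers, C major pentatonic)
SCALE = [60, 62, 64, 67, 69]

class Agent:
    def __init__(self, x, y, pitch=None):
        self.x, self.y = x, y
        # Each lifeform emits one pitch from the scale
        self.pitch = pitch if pitch is not None else random.choice(SCALE)

    def spawn(self, dx, dy):
        # Offspring inherit the parent's pitch, which is what lets the
        # parent pick them out by sound later
        return Agent(self.x + dx, self.y + dy, self.pitch)

def in_earshot(a, b, radius=10.0):
    # Simple distance check standing in for sound attenuation
    return math.hypot(a.x - b.x, a.y - b.y) <= radius

def recognizes(parent, other, radius=10.0):
    # A parent recognizes another agent only if its pitch matches
    # and it's close enough to be audible
    return other.pitch == parent.pitch and in_earshot(parent, other, radius)

parent = Agent(0, 0)
child = parent.spawn(3, 4)       # distance 5: audible, same pitch
far_child = parent.spawn(100, 0)  # same pitch, but out of earshot
print(recognizes(parent, child))      # True
print(recognizes(parent, far_child))  # False
```

A real version would layer this onto steering/flocking updates and actually synthesize the pitches, but even this toy rule hints at how sound could double as a kinship signal.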
I imagine I'd be able to start such a project once I've achieved some measure of economic stability, as it's currently agonizingly difficult to find the motivation (not to mention I'll likely be tapped to help prototype Regime in the near future). Finding the right engine/dev environment will be first on the list, as I'm less interested in writing an engine than in creating interesting behaviors, though I'd still like a moderate amount of control over visual effects and the like.
Speaking of jobs, I've recently been contracted to help with some K-12 curriculum development at DigiPen, specifically recording lectures for a Java AP class. However, it's recently become apparent that the hours aren't initially going to be consistent enough to make it sustainable, so unfortunately I'll have to look for other work in the meantime. Back to the job boards for me!
A RED, RED ROSE
Last but not least, this past weekend I had the pleasure of premiering my choral setting of Robert Burns' famous poem "My Luve's Like a Red, Red Rose" with the choir in which I currently sing, the Cascadian Chorale. The experience has inspired me to try to write choral music more often and, I'm happy to say, the recording turned out quite well, all things considered. You can listen to it here.
That's all for now! I'll post more updates on the job hunt and these various projects as things develop.