Lead Music Designer · Interactive Audio · AI + Adaptive Systems
// selected credits
Lead Music Designer at PlayStation Studios for 17 years. That meant being the person who understood both what the composer intended and how the game engine would actually play it — and closing that gap, every time, on titles with hundreds of millions of players.
Interactive music is a systems problem. A score doesn't exist in isolation — it exists inside a game state machine, responding to player actions, managing transitions between states that the designer mapped and the player discovers in some unpredictable order. Getting that right requires thinking like a composer and an engineer simultaneously.
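As a sketch of what "a score inside a game state machine" means in practice: the designer pre-maps every transition (including how it should sync musically), and the player fires events in whatever order they like. The state names, events, and sync values below are hypothetical; real implementations live in middleware such as Wwise.

```python
# Minimal adaptive-music state machine sketch (illustrative names throughout).
from dataclasses import dataclass

@dataclass(frozen=True)
class Transition:
    """How the score moves between two game states."""
    target: str
    sync: str  # e.g. "next_bar", "next_beat", "immediate"

class MusicStateMachine:
    def __init__(self, initial: str):
        self.state = initial
        self.transitions: dict[tuple[str, str], Transition] = {}

    def map_transition(self, source: str, event: str, target: str, sync: str):
        # The designer maps transitions ahead of time...
        self.transitions[(source, event)] = Transition(target, sync)

    def on_event(self, event: str):
        # ...but the player triggers them in an unpredictable order.
        t = self.transitions.get((self.state, event))
        if t:
            self.state = t.target
        return t

sm = MusicStateMachine("explore")
sm.map_transition("explore", "enemy_spotted", "combat", sync="next_beat")
sm.map_transition("combat", "all_clear", "explore", sync="next_bar")

sm.on_event("enemy_spotted")  # score moves to "combat" on the next beat
```

The `sync` field is the musical half of the problem: a transition is not just a state change but a decision about when, rhythmically, the change is allowed to sound.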
That intersection is where the work has always lived. And it's what makes the AI moment interesting: the tools are finally sophisticated enough to work at that level, if you know how to ask.
A four-layer MCP architecture for natural language control of Wwise via WAAPI. Describe adaptive music behavior in plain English — the system builds it in the engine.
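A hedged sketch of what the bottom layer of such a system might do: translate a structured intent (already parsed from plain English by the upper layers) into a WAAPI call payload. The intent schema here is invented for illustration; the URIs (`ak.wwise.core.object.create`, `ak.wwise.core.object.setProperty`) are real WAAPI endpoints, and a real system would send the payload to Wwise rather than just return it.

```python
# Illustrative intent-to-WAAPI mapping; payloads are built but not sent.
def intent_to_waapi(intent: dict) -> dict:
    """Map a structured intent to a WAAPI URI plus arguments."""
    if intent["action"] == "create_object":
        return {
            "uri": "ak.wwise.core.object.create",
            "args": {
                "parent": intent["parent"],
                "type": intent["type"],  # e.g. "MusicSwitchContainer"
                "name": intent["name"],
            },
        }
    if intent["action"] == "set_property":
        return {
            "uri": "ak.wwise.core.object.setProperty",
            "args": {
                "object": intent["object"],
                "property": intent["property"],
                "value": intent["value"],
            },
        }
    raise ValueError(f"unknown action: {intent['action']}")

call = intent_to_waapi({
    "action": "create_object",
    "parent": "\\Interactive Music Hierarchy\\Default Work Unit",
    "type": "MusicSwitchContainer",
    "name": "CombatMusic",
})
```

In a live setup the payload would go over a WAAPI connection (for example via Audiokinetic's `waapi-client` Python package), which is where "the system builds it in the engine" actually happens.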
View in Lab →

Independent consultancy for studios and creators navigating adaptive music, AI-augmented production, and the mechanics of interactive audio. Source-First Method training available.
BPM →

A compositional methodology for AI music generation, developed over 13+ months, that approaches prompt engineering as music direction rather than search.
Read more →

Interactive audio curriculum at Northwestern Michigan College and Southern Utah University: adaptive music theory, Wwise implementation, and AI tools for game audio practitioners.
About →

Recording project: esoteric electronic ambient built on synth-heavy arrangements with Rhodes, Hammond B3, and analog gear. The working principle is that raw vulnerability beats coolness every time, and that leaving space in a track is as compositional a decision as filling it.
Also records as AIDAxPOUROVER.