Audio Middleware

  • A runtime system that makes audio respond intelligently to what is happening in the game or app.

  • Software that sits between your game/app and the low-level audio engine.

  • There is no direct open-source equivalent to FMOD/Wwise with a comparable feature set.

Using Audio Middleware
  • Without middleware:

    • Developers manually code audio logic using low-level APIs (OpenAL, WASAPI, CoreAudio, etc.)

    • Audio designers depend on programmers for every change

  • With middleware:

    • Designers build audio behavior visually (events, parameters, states)

    • Programmers integrate the middleware once

    • Audio becomes data-driven instead of hardcoded (see the sketch below)
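
A minimal sketch of what "data-driven" means on the code side, assuming a hypothetical engine wrapper around the middleware (the class and function names here are illustrative, not a real FMOD/Wwise API). The programmer writes this call path once; designers can then reroute, layer, or remix the event in the middleware tool without touching game code, instead of asking a programmer to rewrite logic against OpenAL/WASAPI/CoreAudio.

```cpp
#include <cstdio>

// Hypothetical engine-side wrapper around the middleware (names are illustrative).
// Game code references events by name only; the actual audio behavior is data
// authored by sound designers in the middleware tool.
class AudioSystem {
public:
    void postEvent(const char* eventPath) {
        std::printf("post event: %s\n", eventPath);   // would forward to the middleware
    }
    void setParameter(const char* name, float value) {
        std::printf("param %s = %f\n", name, value);  // would forward to the middleware
    }
};

void onPlayerLanded(AudioSystem& audio, float fallSpeed) {
    audio.setParameter("FallSpeed", fallSpeed);  // state, not a mixing decision
    audio.postEvent("event:/Player/Land");       // "what happened", nothing more
}
```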

How it works
  • Middleware is not "play sound(filename)".

  • It is:

Event: Footstep
  - surface type = metal → sound set A
  - speed > 5 → add layer B
  - stamina < 20% → pitch down
  - random variation
  - spatial rules
  - mixing rules
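
A sketch of the game-side half of that footstep event, assuming the FMOD Studio C++ API (2.x). The event path and parameter names ("event:/Player/Footstep", "Surface", "Speed", "Stamina") are illustrative and must match what the designer authored in FMOD Studio; error checking is omitted.

```cpp
#include <fmod_studio.hpp>

// All the branching above (metal -> set A, speed > 5 -> layer B, low stamina ->
// pitch down, random variation, spatialization, mixing) lives in the Studio
// project, not in this function.
void playFootstep(FMOD::Studio::System* studio,
                  float surfaceId, float speed, float stamina)
{
    FMOD::Studio::EventDescription* description = nullptr;
    studio->getEvent("event:/Player/Footstep", &description);

    FMOD::Studio::EventInstance* instance = nullptr;
    description->createInstance(&instance);

    // The game only publishes semantic state; it does not pick sounds.
    instance->setParameterByName("Surface", surfaceId);
    instance->setParameterByName("Speed",   speed);
    instance->setParameterByName("Stamina", stamina);

    instance->start();
    instance->release();   // release the handle; FMOD frees it when playback ends
}
```
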
  • The game engine does not decide sounds; it publishes game state and events.

  • The game engine should not think:

    • which sound to play

    • how loud it is

    • how it blends

    • how it transitions

  • It should think (see the sketch after this list):

    • what happened in the game

    • what continuous state exists

    • what semantic data FMOD needs
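
A minimal sketch of that reporting style, assuming the FMOD Studio C++ API (2.x) and a Studio project that defines global parameters with these names (the names are illustrative). Each frame the game publishes continuous state; FMOD decides how the mix, music, and ambiences respond.

```cpp
#include <fmod_studio.hpp>

// "Telemetry" style update: report continuous state, make no audio decisions.
void publishAudioState(FMOD::Studio::System* studio,
                       float playerHealth, float enemiesNearby, float timeOfDay)
{
    // Global (project-wide) parameters, FMOD Studio API 2.x.
    studio->setParameterByName("PlayerHealth",  playerHealth);
    studio->setParameterByName("EnemiesNearby", enemiesNearby);
    studio->setParameterByName("TimeOfDay",     timeOfDay);

    // Discrete happenings are still posted as events elsewhere (e.g. the
    // footstep example above); this function only reports continuous state.
}
```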

  • The engine's audio API should be designed like a telemetry system: it reports events and state to FMOD rather than issuing playback instructions.

  • On the engine side, the integration typically covers (sketched below):

    • FMOD initialization

    • Bank loading

    • Event registry

    • Parameter registry

    • Threading policy

    • Memory policy
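
A condensed sketch of that once-per-project integration, assuming the FMOD Studio C++ API (2.x). The bank names follow FMOD Studio's default export names but are otherwise assumptions; error checking, the event/parameter registries, and custom memory hooks are omitted.

```cpp
#include <fmod_studio.hpp>
#include <fmod.hpp>

struct AudioEngine {
    FMOD::Studio::System* studio = nullptr;

    void init() {
        FMOD::Studio::System::create(&studio);
        // 512 virtual channels, default Studio + Core flags.
        studio->initialize(512, FMOD_STUDIO_INIT_NORMAL, FMOD_INIT_NORMAL, nullptr);

        // Bank loading: metadata/strings banks first, then whatever content banks the game needs.
        FMOD::Studio::Bank* master  = nullptr;
        FMOD::Studio::Bank* strings = nullptr;
        studio->loadBankFile("Master.bank",         FMOD_STUDIO_LOAD_BANK_NORMAL, &master);
        studio->loadBankFile("Master.strings.bank", FMOD_STUDIO_LOAD_BANK_NORMAL, &strings);
    }

    void tick() {
        // Threading policy: FMOD runs its own mixer thread; the game just pumps
        // the Studio command buffer once per frame from its update thread.
        studio->update();
    }

    void shutdown() {
        studio->unloadAll();   // unload all loaded banks
        studio->release();     // destroys the Studio and Core systems
    }
};
```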

FMOD

Wwise

  • Long-established; widely used by triple-A studios.

Elias

  • Less common

Fabric

  • Older Unity middleware

CriWare (CRI ADX / Atom)

  • Popular in Japan