Excited Waves Acoustics

System Overview

What Is It

Excited Waves Acoustics is a runtime acoustics system designed as an engine- and middleware-agnostic SDK with a current integration in Unreal Engine 5.6. Every sound source gets its own acoustic properties automatically – the system uses optimized raycasting strategies to evaluate surrounding geometry and drives reverb blending, absorption compensation, cross-room blending, and early reflections in real time.

The Problem

Traditional approaches to game audio reverb have significant limitations:

Manual reverb zones require level designers to hand-place and tune reverb volumes for every room and outdoor area. This is time-consuming, error-prone, and scales poorly with level complexity. When level geometry changes during development, all zones must be updated manually.

Baked acoustics pre-compute acoustic responses at probe points. This improves quality but introduces a bake step into the iteration pipeline – every geometry change requires re-baking. Baked systems also do not respond to runtime geometry changes like destruction or procedural generation.

Static approaches in general fail in dynamic environments. If a wall is destroyed during gameplay, reverb zones and baked probes have no way to reflect the changed geometry.

How Excited Waves Solves It

Excited Waves Acoustics works entirely at runtime:

  1. Ray scanning – the detector casts rays in a configurable pattern to sense surrounding geometry. Verification rays distinguish true indoor spaces from openings like doors and windows.

  2. Distance binning – ray hits are classified into Close, Mid, Far, and Outdoor distance bins. These bins map directly to reverb bus sends (short tight reverb for close surfaces, longer tails for far surfaces, etc.).

  3. Material absorption – physical surface types on hit geometry are mapped to absorption coefficients. Mean absorption drives reverb preset selection (e.g. switching between a reflective concrete reverb and a dampened carpet reverb).

  4. Cross-room blending – when the listener is in a different room than the emitter, the system blends the listener's reverb characteristics into the emitter's sends, creating natural transitions between spaces.

  5. Early reflections – first-order reflection positions are calculated from ray hits. In the current integration, Wwise Reflect is used to render early reflections for each sound source.
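The image-source math behind step 5 can be sketched in plain C++. The types and function names below are illustrative assumptions; the plugin's actual implementation, and its hand-off to Wwise Reflect, are not shown here:

```cpp
#include <cmath>

// Minimal standalone vector type for the sketch.
struct Vec3 { float X, Y, Z; };

static Vec3  Sub(Vec3 A, Vec3 B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }
static Vec3  Add(Vec3 A, Vec3 B) { return {A.X + B.X, A.Y + B.Y, A.Z + B.Z}; }
static Vec3  Mul(Vec3 A, float S) { return {A.X * S, A.Y * S, A.Z * S}; }
static float Dot(Vec3 A, Vec3 B) { return A.X * B.X + A.Y * B.Y + A.Z * B.Z; }

// First-order (image-source) reflection: mirror the emitter across the hit
// plane (point P, unit normal N), then intersect the image->listener segment
// with the plane to find where the reflection leaves the surface.
Vec3 ReflectionPoint(Vec3 Emitter, Vec3 Listener, Vec3 P, Vec3 N)
{
    Vec3 Image = Sub(Emitter, Mul(N, 2.0f * Dot(Sub(Emitter, P), N)));
    float T = Dot(Sub(P, Image), N) / Dot(Sub(Listener, Image), N);
    return Add(Image, Mul(Sub(Listener, Image), T));
}
```

For an emitter and listener both 2 m above a floor, the reflection point lands midway between them on the floor plane, as expected from the mirror construction.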

The system adapts continuously to any environment – including geometry that changes at runtime.
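The distance binning in step 2 can be sketched as follows. The bin thresholds and the bin-to-send mapping are illustrative assumptions, not the plugin's shipped values:

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Hypothetical bin edges in meters; the real thresholds are configurable.
enum class DistanceBin { Close, Mid, Far, Outdoor };

DistanceBin ClassifyHit(float HitDistance)
{
    if (HitDistance < 3.0f)  return DistanceBin::Close;   // tight, short reverb
    if (HitDistance < 12.0f) return DistanceBin::Mid;
    if (HitDistance < 40.0f) return DistanceBin::Far;     // long tails
    return DistanceBin::Outdoor;                          // treated as open air
}

// Normalized per-bin hit fractions, usable as reverb-bus send weights.
std::array<float, 4> BinWeights(const std::vector<float>& HitDistances)
{
    std::array<float, 4> Counts{};
    for (float D : HitDistances)
        Counts[static_cast<std::size_t>(ClassifyHit(D))] += 1.0f;
    if (!HitDistances.empty())
        for (float& C : Counts) C /= static_cast<float>(HitDistances.size());
    return Counts;
}
```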

Key Features

Minimal Setup

One component per emitter (or listener). No manual acoustic zones are needed. The detector scans its surroundings and produces results automatically. The only additional setup required is to create reverb busses in Wwise and assign them in the plugin's settings.

Per-Emitter Acoustics

Each sound source gets its own acoustic analysis. A sound playing inside a small room gets tight close reverb while a sound in the same level but outdoors gets open-air reverb – simultaneously and without zones.

Dynamic Geometry

Integrates with Chaos Destruction out of the box. When geometry is destroyed, nearby detectors re-scan and produce updated acoustics.

Frame-Limited Performance

Rays are spread across multiple frames under a configurable budget (e.g. 8 rays per frame). Additional strategies, such as configurable LOD distance thresholds, keep CPU cost stable.
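A frame-limited scan can be sketched with a simple round-robin scheduler. The class, its names, and the callback shape are assumptions for illustration, not plugin API:

```cpp
#include <cstddef>

// Spreads a fixed set of scan directions across frames, casting at most
// RaysPerFrame rays per Tick, as the budgeting described above implies.
class FrameBudgetedScanner
{
public:
    FrameBudgetedScanner(std::size_t TotalRays, std::size_t RaysPerFrame)
        : Total(TotalRays), Budget(RaysPerFrame) {}

    // Call once per frame; CastRay(i) would perform the i-th raycast.
    // Returns true when a full scan pass completed this frame.
    template <typename F>
    bool Tick(F&& CastRay)
    {
        for (std::size_t N = 0; N < Budget; ++N)
        {
            CastRay(Cursor);
            Cursor = (Cursor + 1) % Total;
            if (Cursor == 0) return true; // pass complete; apply results
        }
        return false;
    }

private:
    std::size_t Total  = 0;
    std::size_t Budget = 0;
    std::size_t Cursor = 0;
};
```

With 16 scan directions and a budget of 8 rays per frame, a full environment pass completes every two frames while each individual frame pays a fixed cost.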

Material-Based Absorption

Leverages Unreal Engine's Physical Materials. Map surface types to absorption coefficients in Project Settings; the system uses them to select appropriate reverb presets, reduce send levels in absorptive environments, or switch to a different set of reverb busses for a given absorption level.
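The mapping from surface types to a mean absorption value, and from that to a preset choice, can be sketched as below. The surface names, coefficients, fallback value, and threshold are all example assumptions; the real mapping is authored in Project Settings:

```cpp
#include <map>
#include <string>
#include <vector>

// Example coefficients only (0 = fully reflective, 1 = fully absorptive).
const std::map<std::string, float> AbsorptionBySurface = {
    {"Concrete", 0.05f},  // highly reflective
    {"Glass",    0.10f},
    {"Wood",     0.30f},
    {"Carpet",   0.60f},  // strongly absorptive
};

float MeanAbsorption(const std::vector<std::string>& HitSurfaces)
{
    if (HitSurfaces.empty()) return 0.0f;
    float Sum = 0.0f;
    for (const auto& S : HitSurfaces)
    {
        auto It = AbsorptionBySurface.find(S);
        Sum += (It != AbsorptionBySurface.end()) ? It->second : 0.2f; // fallback
    }
    return Sum / static_cast<float>(HitSurfaces.size());
}

// Illustrative preset switch driven by mean absorption.
std::string SelectReverbPreset(float Mean)
{
    return Mean < 0.25f ? "Reflective_Concrete" : "Dampened_Carpet";
}
```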

Cross-Room Reverb Blending

When a sound emitter is in one room and the listener in another, the plugin blends reverb from both spaces based on distance. This creates smooth, natural transitions through doorways and openings without any manual zones.
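A distance-based blend of two rooms' send levels might look like the sketch below. The linear curve, the `BlendRange` parameter, and the send structure are assumptions; the plugin's actual blend behavior may differ:

```cpp
#include <algorithm>

// Per-bin reverb send levels for one acoustic space (illustrative shape).
struct ReverbSends { float Close, Mid, Far; };

// Blend from the emitter's room toward the listener's room as the
// listener-emitter distance grows, producing smooth doorway transitions.
ReverbSends BlendCrossRoom(const ReverbSends& Emitter, const ReverbSends& Listener,
                           float Distance, float BlendRange)
{
    // 0 at the emitter, 1 once the listener is BlendRange away or farther.
    float T = std::clamp(Distance / BlendRange, 0.0f, 1.0f);
    auto Lerp = [T](float A, float B) { return A + (B - A) * T; };
    return { Lerp(Emitter.Close, Listener.Close),
             Lerp(Emitter.Mid,   Listener.Mid),
             Lerp(Emitter.Far,   Listener.Far) };
}
```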

Wwise Integration

Current integration works with Audiokinetic Wwise:

  • Reverb aux bus sends driven by detection results

  • Absorption-based preset switching with crossfades

  • Early reflections via Wwise Reflect

Middleware-Agnostic Architecture

The core detection engine is independent of any audio middleware; the Wwise integration is one possible implementation. Support for other middleware (MetaSounds, FMOD, etc.) can be added through additional adapters and is planned for future releases.

Comparison with Other Solutions

Core approach

  • Excited Waves: Runtime ray-based environment analysis

  • Steam Audio: Ray tracing with optional propagation baking

  • Project Acoustics: Precomputed wave simulation

  • Wwise Spatial Audio: Room- and portal-based spatial model

Geometry

  • Excited Waves: Uses scene collision / meshes at runtime

  • Steam Audio: Requires acoustic components or geometry export

  • Project Acoustics: Requires full static scene export

  • Wwise Spatial Audio: Requires manual rooms and portals

Baking

  • Excited Waves: Not required

  • Steam Audio: Optional (for propagation)

  • Project Acoustics: Required (offline bake)

  • Wwise Spatial Audio: Not required

Dynamic environments

  • Excited Waves: Fully supported (doors, destruction, moving geometry)

  • Steam Audio: Partially supported

  • Project Acoustics: Static geometry only

  • Wwise Spatial Audio: Limited (portal state changes)

Setup workflow

  • Excited Waves: Add component to actor, create/assign reverb busses

  • Steam Audio: Material mapping and scene setup

  • Project Acoustics: Probe placement + cloud bake

  • Wwise Spatial Audio: Manual room and portal layout

Runtime CPU cost

  • Excited Waves: Stable, frame-limited ray budget

  • Steam Audio: Varies with scene complexity

  • Project Acoustics: Very low (runtime lookup)

  • Wwise Spatial Audio: Low

Platform Support

  • Unreal Engine 5.6.1

  • Windows (Win64)

  • Audiokinetic Wwise 2024.1 or later (for Wwise integration)

Editions

The Trial edition is fully functional in Unreal Editor for evaluation and prototyping. Packaging a standalone game build requires a Full license. Contact us: contact@excitedwaves.com
