webaudio vs physics

Hi all,
In this example, the music playing interacts with one of the physical objects in the scene (the red surface).

How it works:

The main idea is to analyze the decibel variation of each frequency bin in the current sample with respect to the previous one, and add these variations (with some scaling) to the vertical speed of the object.
For this purpose the Web Audio API has a node called AnalyserNode, with methods focused on sound analysis.
The input of this node is connected to the masterVolumeNode of the scene so it receives the samples to extract data from. The extracted data is stored in an array, from which we select the elements we want to process.
The AnalyserNode output is connected to the context.destination of the scene so the music remains audible.
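A minimal sketch of that wiring and the per-bin delta computation. The names `masterVolumeNode`, `body`, and the bin range/scale factor are assumptions for illustration, not taken from the original project:

```javascript
// Sketch: wire an AnalyserNode between the scene's master gain and the output.
function setupAnalyser(context, masterVolumeNode) {
  const analyser = context.createAnalyser();
  analyser.fftSize = 256;                 // 128 frequency bins
  masterVolumeNode.connect(analyser);     // feed the mix into the analyser
  analyser.connect(context.destination);  // keep the music audible
  return analyser;
}

// Pure helper: variation of each bin versus the previous frame.
function computeDeltas(prev, curr) {
  const deltas = new Array(curr.length);
  for (let i = 0; i < curr.length; i++) {
    deltas[i] = curr[i] - prev[i];        // positive = bin got louder
  }
  return deltas;
}

// Per-frame update (illustrative): add the summed variation of the bins
// we care about to the object's vertical velocity, with some scaling.
function updateObject(analyser, prevData, body) {
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);    // current levels, 0..255 per bin
  const deltas = computeDeltas(Array.from(prevData), Array.from(data));
  const lowBins = deltas.slice(0, 8);     // e.g. only the low-frequency bins
  const kick = lowBins.reduce((a, b) => a + b, 0) * 0.5; // arbitrary scale
  body.velocity.y -= Math.max(0, kick);   // push the surface up on loud hits
  prevData.set(data);
}
```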

Unless I am wrong, some properties and methods of the WebAudioSoundManager class don't appear in the documentation. I recommend using console.log(this.sound) to see all the available properties and methods.



Man, nice job!
Do you know if the API can get the BPM from a track?

Thanks @Thyebe,

The Web Audio API doesn't have any explicit method for that. But with the AnalyserNode you can access the raw data of the music (frequencies and levels) and apply your own algorithm.
Take a look at this interesting article:
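As a rough illustration of such an algorithm, here is a toy energy-based onset counter. The window size, threshold factor, and refractory period are arbitrary assumptions for the sketch, not values from the article:

```javascript
// Toy BPM estimator: count frames whose energy exceeds the local average
// by a threshold factor, then convert the beat count to beats per minute.
// energies: one energy value per analysis frame; frameRate: frames per second.
function estimateBPM(energies, frameRate, window = 43, threshold = 1.3) {
  let beats = 0;
  let lastBeat = -window;                  // refractory: skip double-counted onsets
  for (let i = window; i < energies.length; i++) {
    const local = energies.slice(i - window, i);
    const avg = local.reduce((a, b) => a + b, 0) / window;
    if (energies[i] > avg * threshold && i - lastBeat > frameRate / 4) {
      beats++;
      lastBeat = i;
    }
  }
  const seconds = energies.length / frameRate;
  return (beats / seconds) * 60;
}
```

Real beat detectors usually work per frequency band and with adaptive thresholds; this only shows the basic idea of deriving tempo from the raw analyser data.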

