Realtime audio visualizer (for projection mappers)

Hi guys,

I’m looking at making 3D content react in real time to an audio stream. I’ve been looking at this tutorial:

While it’s clear PlayCanvas can visualize an audio source that’s already been loaded, I’m not sure it can handle a live stream, such as the one in the project outlined here (click on the mic)

Is this possible with PlayCanvas?

PlayCanvas does not currently support streaming audio sources, although this is something we plan to implement when we get a chance. In the meantime, you can use the Web Audio API directly yourself, which is essentially what we will do in the engine too. Here’s some info about streaming:
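To make the direct Web Audio approach concrete, here's a minimal sketch of capturing the mic and reading levels from an `AnalyserNode`. This is illustrative rather than engine code: `averageLevel` is a helper name I've made up, and the browser-only wiring is guarded so the pure helper can be read on its own.

```javascript
// Pure helper: average byte-frequency magnitude normalized to 0..1.
// (Name and normalization are illustrative, not part of any engine API.)
function averageLevel(freqData) {
    let sum = 0;
    for (let i = 0; i < freqData.length; i++) sum += freqData[i];
    return freqData.length ? sum / (freqData.length * 255) : 0;
}

// Browser-only wiring: mic stream -> analyser. No AudioBufferSourceNode
// is involved, so nothing needs to be downloaded in advance.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
    navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
        const ctx = new AudioContext();
        const source = ctx.createMediaStreamSource(stream);
        const analyser = ctx.createAnalyser();
        analyser.fftSize = 256;
        source.connect(analyser);
        // Note: don't connect the analyser to ctx.destination unless you
        // want the mic fed back through the speakers.

        const freqData = new Uint8Array(analyser.frequencyBinCount);
        function update() {
            analyser.getByteFrequencyData(freqData);
            const level = averageLevel(freqData); // drive your visuals with this
            requestAnimationFrame(update);
        }
        update();
    });
}
```

Each animation frame you'd read `level` (0 to 1) and map it onto whatever property your visualization exposes.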

As Vaios mentioned, it’s not supported in the engine at the moment.

I was thinking you could hack it in relatively simply. The main difference is that at the moment we use an AudioBufferSourceNode in the sound instance, which requires the full sound buffer to be downloaded in advance. If you could replace the buffer node with a [MediaStreamAudioSourceNode][1], it would stream the audio and the node connections should work as normal. So possibly overriding _createSource in the sound instance would “just work”.
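As an untested sketch of that idea: the node-creation step can be isolated in a small function, and since a MediaStreamAudioSourceNode plugs into the downstream graph the same way a buffer source does, the rest of the instance's connections should be unaffected. `_createSource` is internal engine API, so the commented override shape is a guess, not a confirmed recipe.

```javascript
// Hypothetical replacement for the buffer-source creation step: build a
// MediaStreamAudioSourceNode instead of an AudioBufferSourceNode.
// `context` is a Web Audio AudioContext, `stream` a MediaStream (e.g. mic).
function createStreamSource(context, stream) {
    const source = context.createMediaStreamSource(stream);
    // Downstream nodes (gain, panner, etc.) connect to this node exactly as
    // they would to a buffer source, so the existing graph should still work.
    return source;
}

// Rough idea of wiring it into a sound instance (untested; _createSource is
// internal, so the surrounding field names here are guesses):
// instance._createSource = function () {
//     this.source = createStreamSource(audioContext, micStream);
//     // ...connect this.source where the buffer source used to go...
//     return this.source;
// };
```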

It might be a little more complicated though because of the sound asset preloading and stuff. I’d have to experiment.

However, as Vaios says, you can use audio nodes directly in your script code and everything will work. If you’re not using the asset system (which is fine, since you’re getting input from the mic), you can probably use the tutorial code you linked just the same.
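For example, a script could sample a band of the analyser's frequency bins each frame and scale an entity with the result. The `bandEnergy` helper below is a made-up name, and the commented PlayCanvas wiring is a sketch of the pattern rather than the tutorial's exact code.

```javascript
// Pure helper: mean of frequency bins in [lo, hi), normalized to 0..1.
// (Illustrative; not part of the tutorial or the engine.)
function bandEnergy(freqData, lo, hi) {
    let sum = 0;
    let count = 0;
    for (let i = lo; i < hi && i < freqData.length; i++) {
        sum += freqData[i];
        count++;
    }
    return count ? sum / (count * 255) : 0;
}

// In a PlayCanvas script you'd refresh the analyser data in update() and
// drive the entity from it, roughly like this:
//
// AudioVisualizer.prototype.update = function (dt) {
//     this.analyser.getByteFrequencyData(this.freqData);
//     const bass = bandEnergy(this.freqData, 0, 8); // low bins ~ bass
//     this.entity.setLocalScale(1 + bass, 1 + bass, 1 + bass);
// };
```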

Thanks for the responses, guys! OK, I’ll have to wrap my head around the MediaStreamAudioSourceNode stuff first, then I’ll try applying it to the tutorial.