
I am new to Web Audio and to web development in general, but I'm working on a project that involves capturing audio in the browser and building a live audio communication channel between two or more parties.

So far, I'm planning to use WebTransport with WebCodecs, since WebRTC isn't the best fit for a server-client architecture, but I'm having trouble playing the streamed AudioData chunks that come out of the WebCodecs decoder.
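
For context, here's roughly where I am. This is only a sketch of my setup, not working code: the Opus configuration, the packet framing, and the `handleDecodedChunk` name are placeholders for my own code.

```typescript
// Encoded packets arrive over a WebTransport stream and are fed to a
// WebCodecs AudioDecoder; the decoder hands back AudioData chunks.
const decoder = new AudioDecoder({
  output: (audioData: AudioData) => {
    // This is the part I'm stuck on: how do I render these chunks with Web Audio?
    handleDecodedChunk(audioData);
  },
  error: (e: DOMException) => console.error('decode error', e),
});

decoder.configure({
  codec: 'opus',          // example config; matches what the sender encodes
  sampleRate: 48000,
  numberOfChannels: 2,
});

// Called for every encoded packet read from the WebTransport stream
// (packetBytes and timestampUs come from my own framing, not shown here).
function onPacket(packetBytes: Uint8Array, timestampUs: number): void {
  decoder.decode(new EncodedAudioChunk({
    type: 'key',
    timestamp: timestampUs,
    data: packetBytes,
  }));
}

// Placeholder for the rendering step this question is about.
declare function handleDecodedChunk(data: AudioData): void;
```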

All of the solutions I've found use the Web Audio API and an AudioContext. Some of them queue the decoded audio frames in separate buffers and schedule them for playback; others render in an AudioWorklet, though I'm not sure whether that approach rejoins the chunks into a MediaStream and then connects it to the AudioWorklet.
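
The first approach, as far as I understand it, looks roughly like the sketch below. This is only my reading of those posts: it assumes the decoder outputs 'f32-planar' samples, and `nextStartTime` is my own bookkeeping for scheduling the chunks back to back.

```typescript
const ctx = new AudioContext();
let nextStartTime = 0;

function handleDecodedChunk(data: AudioData): void {
  // Copy the decoded samples into an AudioBuffer, one plane per channel.
  const buffer = ctx.createBuffer(
    data.numberOfChannels,
    data.numberOfFrames,
    data.sampleRate,
  );
  for (let ch = 0; ch < data.numberOfChannels; ch++) {
    const plane = new Float32Array(data.numberOfFrames);
    data.copyTo(plane, { planeIndex: ch }); // assumes 'f32-planar' output
    buffer.copyToChannel(plane, ch);
  }
  data.close();

  // Schedule each chunk right after the previous one to avoid gaps.
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  nextStartTime = Math.max(nextStartTime, ctx.currentTime);
  source.start(nextStartTime);
  nextStartTime += buffer.duration;
}
```

I haven't sketched the AudioWorklet variant because it needs a separate processor module, and that's exactly the part where the explanations I found diverge.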

Also, I found this from 2015 saying that the Web Audio API isn't compatible with streams. Is that still the case?

My problem is that I can't find a clear path to follow. It may just be because I'm new to web development, but none of the approaches above seems clearly documented, and I don't know which is best for my use case.

Any guidance?

  • Welcome to Stack Overflow! Sadly SO is not the best platform for this question. A good first step would be breaking this problem down into smaller problems first. Attempt to solve each problem in abstraction first before stitching everything together. If a problem seems amorphous and you aren't certain how to begin solving it, that is usually a sign that it can be broken down further. – fdcpp Sep 13 '21 at 15:01

0 Answers