I am new to Web Audio and web development in general, but I'm working on a project that involves capturing audio from the browser and building a live audio communication channel between two or more parties.
So far, I'm planning to use WebTransport with WebCodecs, since WebRTC isn't a great fit for a client-server architecture, but I'm running into problems playing the streamed AudioData chunks that come out of the WebCodecs decoder.
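To make this concrete, here's roughly what I picture the decode side looking like (a sketch, not working code: I'm assuming Opus at 48 kHz mono, and `enqueueForPlayback` is just a placeholder for the playback part I can't figure out):

```typescript
// Placeholder for whatever playback mechanism I end up with (the open question).
declare function enqueueForPlayback(data: AudioData): void;

// Receive/decode side: encoded packets arrive over WebTransport and are fed
// to an AudioDecoder, which emits AudioData chunks through `output`.
const decoder = new AudioDecoder({
  output: (audioData: AudioData) => {
    // This is where I'm stuck: how to hand these chunks to Web Audio.
    enqueueForPlayback(audioData);
  },
  error: (e: DOMException) => console.error("decode error:", e),
});

decoder.configure({
  codec: "opus",          // assumption: Opus over the wire
  sampleRate: 48000,
  numberOfChannels: 1,
});

// Called for each packet read from the WebTransport stream/datagram reader.
function onPacket(bytes: Uint8Array, timestampUs: number) {
  decoder.decode(
    new EncodedAudioChunk({
      type: "key",        // Opus frames are independently decodable
      timestamp: timestampUs,
      data: bytes,
    })
  );
}
```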
All the solutions I found use the Web Audio API and an AudioContext. Some of them queue the decoded audio frames in separate buffers for rendering, and others render them in an AudioWorklet; I'm not sure whether that means rejoining the chunks into a MediaStream and then connecting it to an AudioWorklet.
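For example, the buffer-queuing approach I imagine looks something like this: copy each AudioData into an AudioBuffer and schedule it on an AudioBufferSourceNode right after the previous chunk (the timing bookkeeping here is my own guess, so I don't know if this is how it's supposed to be done):

```typescript
const ctx = new AudioContext({ sampleRate: 48000 });
let nextStartTime = 0; // when the next chunk should begin playing

// Copy a decoded AudioData chunk into an AudioBuffer and schedule it
// back-to-back with the previous chunk.
function enqueueForPlayback(audioData: AudioData) {
  const buffer = ctx.createBuffer(
    audioData.numberOfChannels,
    audioData.numberOfFrames,
    audioData.sampleRate
  );
  for (let ch = 0; ch < audioData.numberOfChannels; ch++) {
    const channel = new Float32Array(audioData.numberOfFrames);
    // Requesting f32-planar so the data matches AudioBuffer's float format.
    audioData.copyTo(channel, { planeIndex: ch, format: "f32-planar" });
    buffer.copyToChannel(channel, ch);
  }
  audioData.close(); // release the decoder's memory as soon as possible

  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);

  nextStartTime = Math.max(nextStartTime, ctx.currentTime);
  source.start(nextStartTime);
  nextStartTime += buffer.duration;
}
```

As I understand it, the AudioWorklet variant would instead post the Float32Arrays to a worklet processor through its port and pull them out of a ring buffer in process(), but I haven't found a complete example of that.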
Also, I found this from 2015 saying that the Web Audio API isn't compatible with streams. Is this still the case?
My problem is that I can't find a clear path to follow. That may just be because I'm new to web development, but neither approach is clear to me, and I don't know which is best for my use case.
Any guidance?