
Is there any easy way to stream (broadcast) a media file (ogg, mp3, spx...) from a server to a client (browser) via Node.js, possibly with Socket.IO?

I need to record audio input on the server side and then be able to play it back in real time for many clients. I've been messing with binary.js and socket.io streams, but wasn't able to get it right.

I've tried to encode the audio input with Speex, Vorbis, or LAME and then load it via fs to the client, but I haven't been successful. Or do I have to capture PCM and then decode it in the browser?

Any suggestions would help; nothing I've found so far has worked for me.

Many thanks for any tips, links and ideas.

asked Apr 30, 2014 at 19:34
  • I'm also interested in live audio streaming. The more I read about it (Node streams, back-pressure, buffering, and all the other things you need to take care of), the less I know how to tackle it. There's a nice post on the subject. I'm trying to avoid using SHOUTcast/Icecast, but maybe that would be the easiest way. Commented Aug 13, 2014 at 17:49
  • topic on MDN Commented Aug 13, 2014 at 18:04
  • What's really important for me is to eliminate delay as much as possible. I don't care about quality. The HTML5 audio/video element by itself is delayed in all major browsers with everything I've tried so far, so AudioContext and getUserMedia are probably the way to go. But still no success for me. Commented Aug 14, 2014 at 14:48

3 Answers


You'll want to look for packages that work on streams; from there it's just a matter of piping your streams to the output as necessary. Using Express (or just the built-in http module) you can accomplish this quite easily. Here's an example built around osx-audio (which provides a PCM stream), lame (which can encode a stream to MP3), and Express:

var Webcast = function (options) {
  var lame = require('lame');
  var audio = require('osx-audio');

  // create the lame Encoder instance
  var encoder = new lame.Encoder({
    // input
    channels: 2,       // 2 channels (left and right)
    bitDepth: 16,      // 16-bit samples
    sampleRate: 44100, // 44,100 Hz sample rate

    // output
    bitRate: options.bitrate,
    outSampleRate: options.samplerate,
    mode: (options.mono ? lame.MONO : lame.STEREO) // STEREO (default), JOINTSTEREO, DUALCHANNEL or MONO
  });

  var input = new audio.Input();
  input.pipe(encoder);

  // set up an express app
  var express = require('express');
  var app = express();

  app.get('/stream.mp3', function (req, res) {
    res.set({
      'Content-Type': 'audio/mpeg',
      'Transfer-Encoding': 'chunked'
    });
    encoder.pipe(res);
  });

  var server = app.listen(options.port);
};

module.exports = Webcast;

How you get your input stream might be the most interesting part, but that will depend on your implementation. The popular request package is built around streams as well, so your source might be just an HTTP request away!

answered Sep 25, 2014 at 1:33

3 Comments

Do you have an example of this with socket.io? Over HTTP you have: app.get('/stream.mp3', function (req, res) { res.set({ 'Content-Type': 'audio/mpeg', 'Transfer-Encoding': 'chunked' }); encoder.pipe(res); }); With socket.io, would it be something like socket.on('audio', function (data) { encoder.pipe(data); });? And where is the emit or send to the client?
Haven't tried this solution yet, but looks interesting. I'll let you know how this went.
can anyone point me to some resources that could achieve this in Flask/Flask-SocketIO?

On the web browser you have the HTML5 video element and the audio element. Both of them have sources. Each web browser supports different codecs natively. So you'll want to watch out for that if you're trying to stream mp3.
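
For example, the audio element can list multiple sources and the browser will play the first format it can decode (the URLs below are placeholders):

```html
<audio controls>
  <!-- the browser picks the first source whose type it supports -->
  <source src="http://example.com/music.ogg" type="audio/ogg">
  <source src="http://example.com/music.mp3" type="audio/mpeg">
</audio>
```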

You don't need socket.io, you only need HTTP. Your app is reading a file, music.ogg, and for each chunk it reads, it will send it through the http server. It will be one single HTTP request that's kept open until the file is transferred.

Here's how your HTML will look:

<audio src="http://example.com/music.ogg"></audio>

And your Node.js code will be something like this (untested):

var http = require('http');
var fs = require('fs');

http.createServer(function (request, response) {
  // stream the file straight into the response
  var inputStream = fs.createReadStream('/path/to/music_file.ogg');
  inputStream.pipe(response);
}).listen(8080); // pick whatever port you like

I'm only using the ReadableStream.pipe method on the inputStream and the http and fs modules for the above code. If you want to transcode the audio file (for example, from mp3 to ogg) you'll want to find a module that does that and pipe the data from the file into the transcoder then into response:

// using some magical transcoder
inputStream.pipe(transcoder).pipe(response);

pipe will call end on the destination stream when the source finishes, so the HTTP response ends as soon as the file is done being read (and transcoded).

2 Comments

Thanks a lot for the answer, but what I really meant was a "live stream" synchronised across all clients. Do you think there's a way to, for example, chop the audio source into parts (or use recorded PCM as the source) and serve those parts to clients? The WebRTC approach gives really good results, but it doesn't scale to many connected clients.
You need to use createReadStream instead of open to pipe things.

You can do this with Node and WebRTC. There are some ready-to-use tools like SimpleWebRTC or EasyRTC. From what I've tested, video is still troublesome, but audio works great.

answered May 1, 2014 at 16:29

