Encoding WAV to FLAC and streaming through node.js

I am trying to create an application that takes raw WAV audio as input and outputs FLAC.
I need to stream the input and the output at the same time through node.
Can someone guide me on how to work this out?
Thanks

As far as I know there is no way to do this "live" over the Internet with the FLAC format. Only AAC, MP3, and WAV are supported for live streaming.
You can let the client download the file, and then the client can play it with his or her own apps.
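If the goal is just to get WAV in and FLAC out through node (leaving aside the live-streaming question), one common approach is to pipe the audio through an external encoder such as ffmpeg. A minimal sketch, assuming ffmpeg is installed and on the PATH; the file names are only placeholders:
// Spawn ffmpeg and pipe WAV in on stdin, FLAC out on stdout.
const { spawn } = require('child_process');
const fs = require('fs');

const ffmpeg = spawn('ffmpeg', [
  '-f', 'wav', '-i', 'pipe:0',   // read WAV from stdin
  '-f', 'flac', 'pipe:1'         // write FLAC to stdout
]);

fs.createReadStream('input.wav').pipe(ffmpeg.stdin);      // stream the WAV in
ffmpeg.stdout.pipe(fs.createWriteStream('output.flac'));  // stream the FLAC out
The same pattern works with HTTP request and response streams in place of the file streams, so the input can be encoded while it is still arriving.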

Related

How do I capture an MPEG-DASH stream using Python 3 OpenCV?

I have a URL that links to an MPEG-DASH stream (https://something.com/manifest.mpd). I would like to capture this stream to work with the frames with OpenCV on Python3, which I have installed using pip3. How would I do this?
I have already tried cv2.VideoCapture(URL), but this does not work.
You can try VidGear. It supports the MPEG-DASH format but has yet to incorporate Apple's HLS format. If you want a scalable solution, you can use AWS Elemental MediaConvert, which can convert your source files into formats such as m3u8 (HLS) or mpd (DASH). You can use AWS Elemental MediaLive to do the same for live streams.

Azure Media Services for transcoding and delivering audio

I have a common use case where I want to do the following:
Upload an audio file (WAV/MP3).
Transcode it to 128k or 192k MP3.
Store the audio asset.
Allow the audio asset to be streamed.
Support streaming actions such as play, pause, and seek.
The documentation for Azure Media Services suggests it might support this, but I am not sure; it seems to focus on video content. Does anyone have experience with this?
You can manage audio and encode audio-only assets with Azure Media Services.
WAV is a supported input format/container for an input asset. To see the full list of supported formats, check the following link:
https://azure.microsoft.com/en-us/documentation/articles/media-services-media-encoder-standard-formats/
Check https://github.com/Azure/azure-content/blob/master/articles/media-services/media-services-custom-mes-presets-with-dotnet.md#audio_only to see the audio-only preset options you would use to encode an audio-only asset.

How to stream audio MP3 files on the web

We all know about gaana.com and saavn.com; those websites stream MP3 audio files to the client side but do not allow users to grab the audio files. We want to know what technology they use to stream the MP3 audio files.
Are they using a streaming server or something else? Can you describe the technology they use to stream the audio files?
We are also creating a web app where audio files will be streamed on the client side, and like gaana.com or saavn.com we do not want to allow users to download our MP3 files.
We are also curious about streaming our MP3 audio files in three different qualities: should we convert all the MP3 files into the three qualities and upload them to the server, or is there another solution for this?
If you want to code your own streaming server, you can use DeeFuzzer (https://pypi.python.org/pypi/DeeFuzzer/), a Python-based streaming server, or you can also use ffmpeg or even VLC.
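If you end up serving the files from your own web server rather than a dedicated streaming server, what usually makes play, pause, and seek work in the browser is honoring HTTP Range requests. A minimal node.js sketch; song.mp3 and port 8000 are only placeholders:
// Serve an MP3 with HTTP Range support so the browser's audio player can seek.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  const filePath = 'song.mp3';
  const { size } = fs.statSync(filePath);
  const range = req.headers.range;

  if (range) {
    // A Range header looks like "bytes=start-end"; the end may be omitted.
    const [startStr, endStr] = range.replace('bytes=', '').split('-');
    const start = parseInt(startStr, 10);
    const end = endStr ? parseInt(endStr, 10) : size - 1;
    res.writeHead(206, {
      'Content-Type': 'audio/mpeg',
      'Content-Range': `bytes ${start}-${end}/${size}`,
      'Accept-Ranges': 'bytes',
      'Content-Length': end - start + 1
    });
    fs.createReadStream(filePath, { start, end }).pipe(res);
  } else {
    res.writeHead(200, { 'Content-Type': 'audio/mpeg', 'Content-Length': size });
    fs.createReadStream(filePath).pipe(res);
  }
}).listen(8000);
As for the three qualities, the usual approach is to encode each bitrate ahead of time (or use an adaptive format such as HLS) rather than transcoding on the fly. Note that none of this truly prevents a determined user from saving the audio; it only makes casual downloading less convenient.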

Do Windows Azure live media encoders provide live transcoding?

I have a simple question: I want to stream live video + audio. I would like to use Windows Azure for that (mainly because it seems to provide HLS with AES protection, which I have not encountered in open-source solutions, and pricing per streaming user that is clear for managers). I am troubled by the following quote:
Currently, Media Services does not provide a live transcoding service.
You can use one of the following third-party live encoders that output
RTMP or Smooth Streaming formats: Elemental, Envivio, Cisco, and RGB Networks
encoders output Smooth Streaming; Adobe Flash Live, Wirecast and
Teradek encoders output RTMP.
And a few lines after
You can deliver your live stream in any of the following formats:
Smooth Streaming, DASH and HLS. When doing live streaming, HLS is
packaged dynamically and the default HLS packaging ratio is 3 Smooth
fragments to 1 HLS segment (3:1).
...
Configure a live transcoder.
Every time you reconfigure the transcoder, call the Reset method on
the channel.
So no transcoding is provided, yet I am supposed to set up a transcoder... What? How?
In FFmpeg there are two kinds of "transcoding":
from one encoded data format to another (say, raw PCM data to encoded MP3 frames)
from one frame/packet container to another (say, MP4 frames of already-encoded audio/video to FLV frames carrying the same encoded data)
Are they trying to tell me that they provide frame repacking from RTMP to HLS, but no live encoding into another compression type (say, from Speex audio to AAC)?
As I answered on your other post, you can use a tool like Wirecast 6 to encode your live stream and push it to the Azure ingest URL. We will give you a publish URL that can dynamically package the content into HLS, Smooth Streaming, and DASH.
For more information, please refer to this post: http://azure.microsoft.com/blog/2014/09/10/getting-started-with-live-streaming-using-the-azure-management-portal/
Yes. The second type of transcoding you describe can be better named transpackaging because no video coding is done.
Transcoding is not provided. Transpackaging is provided.
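To make the distinction concrete, a hedged illustration in ffmpeg terms (the file names are only placeholders): the first command only changes the container (transpackaging), the second decodes and re-encodes the streams (transcoding).
# transpackaging: copy the already-encoded streams into a different container
ffmpeg -i input.mp4 -c copy output.flv
# transcoding: decode and re-encode into different codecs
ffmpeg -i input.mp4 -c:v libx264 -c:a aac output_reencoded.mp4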

qt-faststart and ffmpeg to generate a live MP4 file [duplicate]

This question already has answers here:
Live video streaming using progressive download (and not RTMP) in Flash
(2 answers)
Closed 9 years ago.
I am using ffmpeg to create an MP4 file on my server. I am also trying to use qt-faststart to move the moov atom to the front so the file will stream. I have searched all over the internet with no luck. Is it possible to put my video/audio into an MP4-type file and then play it while ffmpeg is still dumping video and audio data into it? The point is that I am trying to stream from a camera, and Android is horrid... I know both iOS and Android support MP4, so I was trying to figure out a way to turn my RTSP stream into MP4.
Main point of the story: I want to continuously feed my camera feed into an MP4 container and still be able to play back the file so my clients can watch.
Any help appreciated, thank you.
You can publish a live stream, and when the stream has ended you can publish the progressive download.
In FFmpeg, to stream live and save a duplicate of that stream into a file at the same time without encoding twice, you can use the tee pseudo-muxer. Something like this:
ffmpeg \
-i <input-stream> \
-f tee "[movflags=+faststart]output.mp4|http://<ffserver>/<feed_name>"
Update: You might try to directly stream a fragmented mp4.
Update 2:
Create a fragmented mp4:
ffmpeg -i input -frag_duration 1000 stream.mp4
Normally, a web server wants to know the file size when serving a file, so to serve the file without knowing its size you need to configure your web server to use chunked transfer encoding.
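As a hedged node.js illustration (stream.mp4 and port 8080 are only placeholders): node's http module falls back to chunked transfer encoding automatically whenever you do not set a Content-Length header, which is what an MP4 of unknown or still-growing size needs. This simple version only serves what has been written to the file at the moment of the request; following a file as it grows takes more work.
// Serve a fragmented MP4 without a Content-Length, so the response is chunked.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'video/mp4' }); // no Content-Length => chunked
  fs.createReadStream('stream.mp4').pipe(res);          // sends what exists so far
}).listen(8080);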
