FFmpeg segment doesn't show file size update in real time - node.js

I'm trying to run an ffmpeg MP3 stream with segmentation for each hour. Everything is working perfectly except for one thing: when I run the command, the file size doesn't grow in real time as I need; it only grows in chunks of 256k.
Is there a way to turn on a "real-time mode"?
I'm using Ubuntu 18.04 with FFmpeg 3.4.6.
This is the command I'm trying to run in the Linux terminal:
ffmpeg -i http://radiocentova.conectastm.com:8363/stream -y -acodec libmp3lame -b:a 16k -ac 1 -ar 11025 -vn -strftime 1 -f segment -segment_time 3600 -flush_packets 1 #test_%Y%m%d%H%M%S+00.mp3
(Screenshots: recording with segment vs. recording without segment.)

The flush-packets option has to be directed to the child muxer (mp3 in this case), so use -segment_format_options flush_packets=1 instead of -flush_packets 1.
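Put together, the corrected command from the question would then look like this (identical to the original, with only the flush option moved onto the child muxer):
ffmpeg -i http://radiocentova.conectastm.com:8363/stream -y -acodec libmp3lame -b:a 16k -ac 1 -ar 11025 -vn -strftime 1 -f segment -segment_time 3600 -segment_format_options flush_packets=1 #test_%Y%m%d%H%M%S+00.mp3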

Related

filesize is not growing as expected

I am trying to record a stream on my machine to study the ffmpeg library, but without success.
I have a file watcher that cleans up bugged streams: every 3 minutes it deletes files that have not changed in the last 3 minutes.
The real problem is, if I use the command below:
/usr/bin/ffmpeg -i http://sysrad.net:10090/ -y test.mp3
this command doesn't apply any codec or audio transformation, so my target file (test.mp3) reaches 256k quickly; but if I use this command below:
/usr/bin/ffmpeg -i http://sysrad.net:10090/ -y -b:a 8k -ac 1 -ar 11025 test.mp3
my target file (test.mp3) stays at 0k until the recording reaches 256k. I am not sure whether this is a Unix problem or an ffmpeg problem.
Some other information: if I run this in a loop:
while true; do wc -l test.mp3; sleep 0.5; done;
the test.mp3 file stays at 0 lines until it reaches 256k in size...
I have no idea how to work around this to get the real-time file size for each 1k that ffmpeg pulls from the stream with those codecs. Do you have any idea how I can handle that?
Thanks!!!!
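Assuming this has the same cause as the question above (the muxer buffering output before it flushes), a hedged guess at a fix is the flush-packets option mentioned in the answer above; with a plain, non-segmented mp3 output it can be passed directly:
/usr/bin/ffmpeg -i http://sysrad.net:10090/ -y -b:a 8k -ac 1 -ar 11025 -flush_packets 1 test.mp3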

FFmpeg How to use alimiter Filter?

I cannot find enough documentation on the alimiter filter.
https://ffmpeg.org/ffmpeg-filters.html#alimiter
I used -filter_complex alimiter=limit=0.5 and it applied to the file but it boosted the volume.
I thought it was supposed to hard-limit the volume down?
FFmpeg reports on the command line that the limit range is [0.0625 - 1].
ffmpeg -i audio.wav -y -acodec libmp3lame -b:a 320k -ar 44100 -ac 2 -joint_stereo 1 -filter_complex alimiter=limit=0.5 audio.mp3
Here's a look at the two files through Adobe Audition (screenshots: original vs. FFmpeg alimiter 0.5).
I found the problem was here:
level
Auto level output signal. Default is enabled. This normalizes audio back to 0dB if enabled.
I tried chaining the filter like this using level=disabled
-filter_complex alimiter=level_in=1:level_out=1:limit=0.5:attack=7:release=100:level=disabled
It now hard limits without raising the volume.
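For completeness, the full command from above with that filter in place looks like this (same encoder settings as before, only the alimiter options changed):
ffmpeg -i audio.wav -y -acodec libmp3lame -b:a 320k -ar 44100 -ac 2 -joint_stereo 1 -filter_complex alimiter=level_in=1:level_out=1:limit=0.5:attack=7:release=100:level=disabled audio.mp3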

FFMPEG merging audio and video to get resulting video

I need to merge audio and video using ffmpeg so that the result is a video with the same duration as the audio.
I have tried two commands for this in my Linux terminal. Both commands work for some of the input videos, but for other input videos they produce output that is the same as the input video; the audio doesn't get merged.
The commands I have tried are:
ffmpeg -i wonders.mp4 -i Carefull.mp3 -c copy testvid.mp4
and
ffmpeg -i wonders.mp4 -i Carefull.mp3 -strict -2 testvid.mp4
and
ffmpeg -i video.mp4 -i audio.wav -c:v copy -c:a aac -strict experimental output.mp4
and these are my input videos -
samplevid.mp4
https://vid.me/z44E
duration - 28 seconds
size - 1.1 MB
status - working
And
wonders.mp4
https://vid.me/gyyB
duration - 97 seconds
size - 96 MB
status - not working
I have observed that the large size (more than 2 MB) of the input video is probably the issue, but I still want a fix.
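One variant worth trying (my own suggestion, not something from the original post) is to map the streams explicitly so the audio track cannot be dropped silently, and let -shortest cut the output at the shorter input; note this matches the audio duration only when the audio is the shorter of the two:
ffmpeg -i wonders.mp4 -i Carefull.mp3 -map 0:v:0 -map 1:a:0 -c:v copy -c:a aac -shortest testvid.mp4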

How to stream on YouTube using a Raspberry Pi?

So I'm trying to stream on YouTube using a Raspberry Pi. The idea is for one Raspberry Pi to be used to stream the connected webcam and for another to display the stream, sort of like a surveillance camera. Both Raspberry Pis are currently running Raspbian.
So, is it possible for me to stream directly to YouTube on a Raspberry Pi?
You can use any Pi supported RTMP/Flash encoder to publish a YouTube live event. One example is ffmpeg which can be compiled on Raspbian.
Create your YouTube live event using the guide. You can find the various encoder settings here.
When everything is ready you can start streaming. For a 640x480@25 700k video stream the command will be something like:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 -c:v libx264 -b:v 700k -maxrate 700k -bufsize 700k -an -f flv rtmp://<youtube_rtmp_server>/<youtube_live_stream_id>
"So is it possible for me to stream directly to YouTube on a Raspberry
Pi?"
Yes. But you're going to need to do a bit of configuring and get different hardware depending on your project needs.
For my project, a day and night doorway "security camera" that streams live to youtube, I chose a Raspberry Pi Zero W running raspbian (headless) and a camera module with auto IR switching capabilities and IR lights.
I have edited the raspbian image so all of the configurations of the wifi and camera module interfaces, code, and dependencies I need are pre-installed, so I can just flash an sd card, slap it in a pi+camera+powersupply setup and it does its thing.
So, for this answer to be helpful at all, you're going to need to do plenty of research on FFMPEG, know what it is, learn what it does, and get it installed on your board... You should be able to run a few tests getting FFMPEG to just spit out maybe a 10-second long video from your camera. I wouldn't bother reading any more of my ramblings if you have not got that far yet, because things are about to get specific.
So, your board is online, you can see it on the network, it's got internet, it's got ffmpeg, it's ready to go.
Here is the ffmpeg "stream command" I use to start the live stream:
raspivid -o - -t 0 -vf -hf -fps 60 -b 12000000 -rot 180 | ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -i - -vcodec copy -acodec aac -ab 384k -g 17 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/SESSION_ID
I arrived at this "stream command" above by tweaking each parameter you see, one by one, and in different combinations and I eventually got a really crisp 1080p stream with no buffering issues at all except for the occasional bit of wifi lag that comes around on my setup. You are going to need to do a ton of research into what every parameter does to get things just right and trust me it's going to be a pain figuring out what does what in the beginning. I would lurk all around StackOverflow and other resources and just plug around and see what you can get to come out of your setup when it comes to these FFMPEG commands.
To test if this "stream command" or any other you find works for you, just change SESSION_ID at the end to your stream key and run it in the console.
After you get an output you are happy with, figure out on your own how you want to trigger your camera to start streaming, if you want it to start recording as soon as the board is ready to start sending data, you accomplish this by putting your "stream command" in /etc/rc.local and it will run that command as soon as it can.
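As a rough sketch of that idea (reusing the stream command above; SESSION_ID is still a placeholder for your stream key), /etc/rc.local could end with something like:
# start the YouTube stream in the background at boot; keep "exit 0" as the last line
raspivid -o - -t 0 -vf -hf -fps 60 -b 12000000 -rot 180 | ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -i - -vcodec copy -acodec aac -ab 384k -g 17 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/SESSION_ID &
exit 0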
For my project, I use 18650 cells charged by solar panels as the power source, so I have to be conscious about the power I use, and I wrote a NodeJS program to monitor just that.
Alright, that's enough talking into the wind for now. Hopefully, any of this helped someone out there, cheers.
Audio working! This worked for me on a Raspberry Pi 4 with a v1.3 Raspberry Pi camera and a cheap USB audio interface. It also grabs the default audio device, which you can set in alsamixer:
raspivid -o - -t 0 -vf -hf -fps 30 -b 6000000 | ffmpeg -f alsa -ac 1 -ar 44100 -i default -acodec pcm_s16le -f s16le -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 60 -strict -2 -f flv rtmp://<destination>/<streamkey>

Problems with point to point streaming using FFmpeg

I want to live stream video from a webcam and sound from a microphone from one computer to another, but there are some problems.
When I use this command line:
ffmpeg.exe -f dshow -rtbufsize 500M -i video="Camera":audio="Microphone" -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234
The FFmpeg console starts filling with yellow warning messages and the stream becomes unstable: http://s16.postimg.org/qglcgr345/Untitled.png
To solve this problem I added a new parameter to the command line to set the frame rate, -r 25:
ffmpeg.exe -f dshow -rtbufsize 500M -r 25 -i video="Camera":audio="Microphone" -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234
After I added -r 25 the problem with the yellow messages disappeared, but another problem appeared. When I freshly start FFmpeg with this command line, the video and sound look synchronous, but after one or two minutes a lag of about 25 seconds appears between video and sound; the sound falls behind the video. I have tried this with different protocols (UDP, TCP, RTP) but the problems are the same. Please help me!
I found the answer to my problem with "-r" and asynchronous audio and video. For anyone interested, the answer is here: https://trac.ffmpeg.org/wiki/DirectShow (in the paragraph "Specifying input framerate").
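In short, the fix described there (a hedged paraphrase of that wiki section) is to set the capture rate on the dshow input with -framerate instead of forcing it afterwards with -r:
ffmpeg.exe -f dshow -rtbufsize 500M -framerate 25 -i video="Camera":audio="Microphone" -c:v mpeg4 -c:a mp2 -f mpegts udp://127.0.0.1:1234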

Resources