I am using an Amazon CloudFront RTMP distribution for serving video, but I have run into a cache issue.
I uploaded the file 'abc.mp4' to the bucket that the RTMP distribution uses. After that I replaced the file with an updated video under the same name (i.e. 'abc.mp4'). But the RTMP distribution still serves the previous version of the video.
As far as I can see, there is no cache setting when creating an RTMP distribution.
Has anybody faced the same issue? Please help me find a solution.
Thanks.
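In case it helps others hitting this: CloudFront caches objects at the edge, and for RTMP distributions I don't believe there is a cache-control or invalidation knob comparable to web distributions, so a common workaround is to publish each updated video under a new, versioned key rather than overwriting 'abc.mp4' in place. A minimal sketch with boto3 (bucket name and version suffix are hypothetical):

```python
def versioned_key(base, version):
    """Cache-busting object key: 'abc.mp4' -> 'abc-v2.mp4'."""
    stem, dot, ext = base.rpartition(".")
    return f"{stem}-{version}.{ext}" if dot else f"{base}-{version}"

# Upload the updated file under the new key instead of overwriting the old one
# (hypothetical bucket name):
# import boto3
# boto3.client("s3").upload_file("abc.mp4", "my-video-bucket",
#                                versioned_key("abc.mp4", "v2"))
# Then point the player at the new key.
```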
I'm trying to backup local files to Amazon Glacier using the boto3 Python library on my NAS box (Zyxel NAS326 running Python 3.7 on entware-ng). While this generally works, I found that the upload speed is extremely slow. To rule out general problems with the NAS box or my internet connection, I did the following:
Running my backup program on my desktop computer: maximum upload rate used
Uploading a file to the internet from my NAS box using FTP: maximum upload rate used
On my router I could see that there are only short peaks of outgoing data followed by long delays.
To narrow down the problem I logged the file access during the upload. This showed that there is no delay reading from disk; the delay occurs while sending the data over the HTTPS connection. It turned out that a chunk of data (usually about 9 MB) is read from the file, then there is a short burst of activity on the internet connection, then a delay of at least 10 seconds before more data is read from the file. So it seems that the connection is somehow blocking the upload, but I have no idea why, or what I could do about it.
Has anyone seen this behaviour before or has ideas what else I could try?
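One way to probe where the stall comes from is to drive the Glacier multipart-upload API yourself with an explicit part size and time each part: if the ~10-second stall shrinks with smaller parts, the bottleneck scales with chunk size (e.g. TLS work on the NAS CPU); if it stays constant, something per-request is blocking. A sketch (vault name and file are hypothetical; `initiate_multipart_upload` and `upload_multipart_part` are the real boto3 `glacier` client calls, and Glacier requires the part size to be a power-of-two number of MiB):

```python
def part_ranges(total_size, part_size):
    """Content-Range values Glacier expects for each multipart part,
    e.g. 'bytes 0-1048575/*'."""
    ranges = []
    for start in range(0, total_size, part_size):
        end = min(start + part_size, total_size) - 1
        ranges.append(f"bytes {start}-{end}/*")
    return ranges

# Hypothetical timing driver (requires AWS credentials, so commented out):
# import boto3, os, time
# part = 1024 * 1024  # 1 MiB; try several power-of-two sizes
# glacier = boto3.client("glacier")
# up = glacier.initiate_multipart_upload(vaultName="nas-backup",
#                                        partSize=str(part))
# size = os.path.getsize("backup.tar")
# with open("backup.tar", "rb") as f:
#     for rng in part_ranges(size, part):
#         t0 = time.time()
#         glacier.upload_multipart_part(vaultName="nas-backup",
#                                       uploadId=up["uploadId"],
#                                       range=rng, body=f.read(part))
#         print(rng, time.time() - t0)
```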
I sometimes upload files to my website (usually mp4 files, so we'll call them videos for now), but when I then get the URL for them, the video doesn't load swiftly at all.
My website is hosted on my pi at home
I'm using express.js
I'm app.use()ing the compression package.
speedtest.net returns 38.44 Mb/s down and 13.91 Mb/s up
Here's an example video. The video is 39 MB in size.
Would anyone know how I could get much faster loading speeds? I'm 100% sure the problem is server-side (the Pi).
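One thing worth checking on the Pi: the compression middleware will try to deflate mp4 responses, which costs CPU on every request but cannot shrink already-compressed video. A quick Python illustration, using high-entropy bytes as a stand-in for encoded video:

```python
import os
import zlib

# Encoded video is essentially incompressible; deflate on top of it only
# adds CPU work plus a few bytes of framing overhead.
data = os.urandom(1_000_000)          # stand-in for an mp4 payload
compressed = zlib.compress(data, level=9)
ratio = len(compressed) / len(data)   # ends up at roughly 1.0
```

If that matches what you see, configuring the middleware to skip video MIME types (I believe the compression package accepts a filter option for this) should help far more than compressing harder.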
My requirement is to create three images of different sizes simultaneously in an S3 bucket. I found a Node module, node-s3-uploader, which suits my requirement. As per the documentation, to upload an image I need to provide the image's path. But I am not uploading a file from my local disk; I am POSTing it through Advanced REST Client, a Chrome plugin. Can somebody suggest how I can work around the image-path requirement of node-s3-uploader?
If that is not possible, is there another mechanism I should follow?
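Not a node-s3-uploader-specific answer, but the usual workaround when an uploader insists on a file path: spool the POSTed body to a temporary file first, then hand that path to the uploader. The idea, sketched in Python (the function name is mine):

```python
import pathlib
import tempfile

def spool_to_temp(upload_bytes, suffix=".jpg"):
    """Write an in-memory upload to a temp file and return its path,
    which can then be passed to any path-based uploader."""
    with tempfile.NamedTemporaryFile(delete=False, suffix=suffix) as f:
        f.write(upload_bytes)
        return f.name
```

In Node the equivalent would be letting a multipart parser such as multer write the upload to disk and passing its reported file path to node-s3-uploader.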
I have an HTML5 app that allows users to upload and play their audio files. The server is Node running on Heroku.
To allow cross-browser audio play, what I understand is that I have to at least maintain two formats of each audio file, let's say .mp3 and .ogg. So, I need to transcode the files automatically on the server side.
The problem is that Heroku does not run ffmpeg. I found this project that creates a custom buildpack for heroku that supports ffmpeg but it seems to be for Rails apps: https://github.com/dzello/ffmpeg-heroku.
I was thinking of running an external server for transcoding: my Node.js app sends it the file, it does the transcoding, and it uploads the new file back to my Node.js server. But I don't know how to set up such a server, or whether there is already a ready-made solution that does this kind of work.
So, here are my questions:
1- Is there a solution to run ffmpeg on heroku+nodejs?
2- How can I set up a transcoding server that communicates with my nodejs+heroku server?
Thanks!
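Regarding question 2: the external transcoder can be very small. Receive the uploaded file, shell out to ffmpeg once per target format, and send (or re-upload) the results. A hedged sketch of the core step in Python (function names are mine; it assumes ffmpeg is installed on that server, which is exactly what stock Heroku lacks):

```python
import pathlib
import subprocess

def ffmpeg_args(src, dst):
    """Command line for one conversion; ffmpeg infers the codec from dst."""
    return ["ffmpeg", "-y", "-i", src, dst]

def transcode_both(src):
    """Produce the .mp3 and .ogg variants next to the source file."""
    outputs = []
    for ext in (".mp3", ".ogg"):
        dst = str(pathlib.Path(src).with_suffix(ext))
        subprocess.run(ffmpeg_args(src, dst), check=True)  # needs ffmpeg on PATH
        outputs.append(dst)
    return outputs
```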
Why do you need to run it on Heroku? Just set up a virtual server, for example on DigitalOcean, and use a Linux server to set up Node. It's pretty easy and will run every package you need. DigitalOcean already offers a droplet with a preconfigured Node.js + MongoDB stack.
Currently using PHP 5.3.x & Fedora
Ok. I'll try to keep this simple. I'm working on a tool that allows the upload & storing of audio files on S3 for playback. Essentially, the user uploads a file (currently only allowing mp3 & m4a) to the server, and the file is then pushed to S3 for storage via the PHP SDK for amazon aws.
The missing link is that I would like to perform a simple bitrate & format conversion of the file prior to uploading it (ensuring that all files are 160 kbps .mp3).
I've looked into ffmpeg, although it seems that the PHP library only allows reading bitrates and other metadata, not actual conversion.
Does anyone have any thoughts on the best way to approach this? Would running a shell_exec() command that performs the conversion be sufficient to do this, or is there a more efficient/better way of doing this?
Thanks in advance! Any help or advice is much appreciated.
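For what it's worth, shell_exec() with a one-line ffmpeg command is a workable start at low volume. The command itself would look like the following; it's sketched in Python here for illustration, but `-codec:a libmp3lame -b:a 160k` are standard ffmpeg options for a 160 kbps MP3 target:

```python
import shlex

def conversion_cmd(src, dst):
    """ffmpeg invocation (e.g. for shell_exec): any mp3/m4a input
    -> 160 kbps MP3 via libmp3lame."""
    return ("ffmpeg -y -i " + shlex.quote(src)
            + " -codec:a libmp3lame -b:a 160k " + shlex.quote(dst))

# Python equivalent of shell_exec() on that command:
# import subprocess
# subprocess.run(conversion_cmd("in.m4a", "out.mp3"), shell=True, check=True)
```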
You need to perform the conversion and upload to S3 'outside' of the PHP application, as it'll take too long for the user to hang around on the page. This could be a simple app that uses ffmpeg from the command line.
I'm not familiar with Linux, so perhaps someone else can provide a more specific answer, but here is the basic premise:
User uploads file to server.
You set some kind of flag (eg in a database) for the user to see that the file is being processed.
You 'tell' your external encoder that a file needs to be processed and uploaded - you could use an entry in a database or some kind of message queue for this.
The encoder (possibly a command line app that invokes ffmpeg) picks up the next file in the queue and encodes it.
When complete, it uploads it to S3.
The flag is then updated to show that processing is complete and that the file is available.
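The steps above can be sketched as a small polling worker. Here it is in Python, with an in-memory SQLite table standing in for the flag/queue and the ffmpeg and S3 calls stubbed out as comments (table and function names are mine):

```python
import sqlite3

# Stand-in for the application's database; in production, 'status' is the
# flag the user-facing page reads.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (path TEXT, status TEXT)")

def enqueue(path):
    # Steps 2-3: flag the uploaded file as awaiting processing.
    db.execute("INSERT INTO jobs VALUES (?, 'pending')", (path,))

def next_job():
    # Step 4: the encoder picks up the next pending file.
    return db.execute(
        "SELECT rowid, path FROM jobs WHERE status = 'pending' LIMIT 1"
    ).fetchone()

def process(rowid, path):
    db.execute("UPDATE jobs SET status = 'processing' WHERE rowid = ?", (rowid,))
    # encode(path)        # step 4: shell out to ffmpeg here
    # upload_to_s3(path)  # step 5: push the encoded file to S3
    db.execute("UPDATE jobs SET status = 'done' WHERE rowid = ?", (rowid,))
    # Step 6: 'done' is what the page checks to show the file is available.
```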