Speed up file loading in Express (Node.js)

I sometimes upload files to my website (usually mp4 files, so we'll call them videos for now), but when I then open the URL for one of them, the video doesn't load quickly at all.
My website is hosted on my Raspberry Pi at home.
I'm using express.js
I'm app.use()ing the compression package.
speedtest.net returns 38.44 Mb/s down and 13.91 Mb/s up
Here's an example video; it is 39 MB in size.
Would anyone know how I could get much faster loading speeds? I'm 100% sure the problem is on the server (Pi) side.
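For reference, a minimal sketch of the setup described above; the port, paths, and the compression filter are illustrative assumptions, not the asker's actual code:

const express = require('express');
const compression = require('compression');

const app = express();

// mp4 is already compressed, so gzipping it again mostly burns Pi CPU.
// Skip compression for video responses; express.static also supports
// HTTP range requests, which progressive video playback relies on.
app.use(compression({
  filter: (req, res) => {
    const type = String(res.getHeader('Content-Type') || '');
    if (type.startsWith('video/')) return false;
    return compression.filter(req, res); // fall back to the default filter
  }
}));

app.use(express.static('public')); // e.g. public/videos/example.mp4

app.listen(3000);

Also worth noting: visitors download at the Pi connection's upload rate. 39 MB is about 312 Mbit, so at 13.91 Mb/s up the transfer takes at least ~22 seconds no matter how the server is tuned.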

Related

Does downloading a file from a URL in Node.js use the user's internet connection, or does it work in the background?

I'm currently building a Node.js streaming app which has to fetch a file from a third party and then cache it in the local storage of the virtual machine running Node.js (Heroku).
I want to ask: if I request the download of a file in my Node.js app, does the user's internet speed matter, even though the file is not being downloaded in the browser?
Can I download the file in the background, without user interaction, when I deploy to Heroku?
Thanks; it would also help if you could explain how internet bandwidth is consumed. I'm concerned about this because internet access is expensive in the country I'm in, so I want to reduce my users' data usage.
In short - the machine/computer running the download code is the one consuming the internet bandwidth.
So, if your node.js app is running on Heroku, the download is between the Heroku machine and the 3rd party server(s), thus not consuming the user's bandwidth (that data doesn't flow through the user's device).
However, when the user streams that file from your Node.js app to their device, that will definitely consume their bandwidth.
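A minimal sketch of such a server-side download in Node.js; the URL and destination path are illustrative assumptions:

// Download a file on the server (e.g. a Heroku dyno); the transfer
// consumes the server's bandwidth, not the user's.
const https = require('https');
const fs = require('fs');

https.get('https://third-party.example.com/video.mp4', (res) => {
  res.pipe(fs.createWriteStream('/tmp/video.mp4'))
     .on('finish', () => console.log('cached locally'));
});

Keep in mind that a Heroku dyno's filesystem is ephemeral, so anything cached this way is lost when the dyno restarts.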

Delays when uploading archives to Amazon Glacier using boto3 from a NAS box

I'm trying to backup local files to Amazon Glacier using the boto3 Python library on my NAS box (Zyxel NAS326 running Python 3.7 on entware-ng). While this generally works, I found that the upload speed is extremely slow. To rule out general problems with the NAS box or my internet connection, I did the following:
Running my backup program on my desktop computer: maximum upload rate used
Uploading a file to the internet from my NAS box using FTP: maximum upload rate used
On my router I could see that there are only short peaks of outgoing data followed by long delays.
To narrow down the problem I logged the file access during the upload. This showed that there is no delay reading from disk; the delays occur while sending the data over the HTTPS connection. It turned out that a chunk of data is read from the file (usually about 9 MB), then there is a short burst of activity on the internet connection, then a delay of at least 10 seconds before more data is read from the file. So it seems that the connection is somehow blocking the upload, but I have no idea why, or what I could do about it.
Has anyone seen this behaviour before or has ideas what else I could try?

How to handle lots of file downloads from my linux server

I have a 50 MB file hosted on my dedicated Linux server; each day almost 50K users download this file (about 2.5 TB a day).
There are lots of crashes, and users report that sometimes the file can't be downloaded at all because the server is overloaded.
I wonder if someone can help me calculate which server/bandwidth/anything else I need to handle that?
Is there any solution where I can host the file and pay per download?
Is there any setting or anything that I can improve or do on my server that will help fix this issue?
My current server specification is:
2 x Intel Xeon E5 2620V2
2 x (6 x 2.10 GHz)
128 GB REG ECC
256GB SSD HD
1 IP Address
1 Gbit/s port Shared Bandwidth
I'll appreciate any help from you guys.
Thank you very much.
Your hardware configuration should probably be fine. At least if the downloads are more or less evenly distributed over the day.
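As a rough check: 50,000 downloads × 50 MB ≈ 2.5 TB per day, which averages out to about 29 MB/s, i.e. roughly 230 Mbit/s, comfortably within a 1 Gbit/s port. If the downloads cluster around peak hours, however, a shared 1 Gbit/s port can saturate, which would explain the failed downloads.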
One of the most effective HTTP servers for serving static content is nginx. Take a look at this guide: Serving static content.
If that doesn't help, you should consider Amazon S3, which is probably the most popular file hosting solution with a reasonable price tag.
This is how not to make the file available for download:
// Anti-pattern: read the entire file into memory, then push it out from userspace
const data = fs.readFileSync(filename);
res.send(data);
You want to be using sendfile(2) to have the kernel stream the file directly into the socket instead of doing it in userspace.
Each server has its own mechanism for invoking sendfile(2); with Apache httpd this is mod_xsendfile and its associated response header (X-Sendfile). You'll find that moving to this will allow you not only to handle your current userbase but also to add many more users without worry.
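For example, behind Apache httpd with mod_xsendfile enabled, the application only names the file and lets httpd stream it from the kernel. A minimal sketch in Express; the route and file path are illustrative assumptions:

const express = require('express');
const app = express();

app.get('/download', (req, res) => {
  // mod_xsendfile intercepts this header and serves the file itself
  // via sendfile(2); the app never touches the file's bytes.
  res.set('X-Sendfile', '/var/www/files/big-file.bin'); // illustrative path
  res.end();
});

app.listen(3000);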

Amazon RTMP stream serving cached media file

I am using an Amazon RTMP distribution for serving video, but I have found a cache issue.
I uploaded the file 'abc.mp4' to the bucket that the RTMP distribution uses. I then replaced that file with an updated video file of the same name (i.e. 'abc.mp4'), but the RTMP distribution still serves the previous video.
As far as I can see, there is no cache setting when creating an RTMP distribution.
Has anybody faced the same issue? Please help me find a solution.
Thanks.

Uploading large files in JSF

I want to upload a file that is >16GB. How can I do this in JSF?
When using HTTP, you'll face two limitations: one on the client side (the web browser) and one on the server side (the web server). The average web browser (IE/FF/Chrome/etc.) has a limit of 2~4 GB, depending on the make/version/platform. You cannot control this from the server side; the end user has to change the browser settings themselves (sometimes this isn't possible at all). The average web server (Tomcat/JBoss/Glassfish/etc.) in turn has a limit of 2 GB. You can configure this, but it still won't and can't remove the limitation on the web browser.
Your best bet is FTP. If you want to do this from a web page, consider an applet which utilizes Apache Commons Net's FTPClient. There are several ready-to-use open source/commercial ones, by the way.
You do, however, still need to take into account whether the file system on the FTP server side supports files that large. FAT32, for example, has a limit of 4 GB per file; NTFS and several *nix file systems can go up to 16 EB.
