Does downloading a file from a URL in Node.js use the user's internet, or does it work in the background? - node.js

I'm currently building a Node.js streaming app which has to fetch a file from a third party and then cache it to the local storage of my virtual machine running Node.js (Heroku).
I want to ask: if my Node.js app requests a file download, does the user's internet speed matter, even though the file is not being downloaded in the browser?
Can I download the file in the background once I deploy to Heroku, without user interaction?
Thanks. If you can, please explain how internet bandwidth is consumed along the way. I'm concerned about this because internet is expensive in the country I'm in, so I want to reduce my users' internet usage.

In short - the machine/computer running the download code is the one consuming the internet bandwidth.
So, if your node.js app is running on Heroku, the download is between the Heroku machine and the 3rd party server(s), thus not consuming the user's bandwidth (that data doesn't flow through the user's device).
However, when the user streams that file from your Node.js app to their device, that will definitely consume their bandwidth.
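As a rough illustration (not the poster's actual code; the URL and cache path are placeholders), a server-side download in Node.js looks like this, and every byte of it flows between the third party and the server only:

```js
// Minimal sketch: cache a third-party file to the server's local storage.
// Runs entirely on the server (e.g. a Heroku dyno); the user's device is
// not involved. Does not follow redirects.
const https = require('https');
const fs = require('fs');

function cacheFile(url, localPath) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      if (res.statusCode !== 200) {
        reject(new Error(`Download failed: ${res.statusCode}`));
        return;
      }
      // Stream the response body straight to disk instead of buffering it.
      res.pipe(fs.createWriteStream(localPath))
        .on('finish', resolve)
        .on('error', reject);
    }).on('error', reject);
  });
}
```

Only when you later pipe that cached file out to a requesting client does the user's own bandwidth come into play.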

Related

NGINX and .NET Core application performance

I have a .NET Core 5 website hosted with NGINX on Linux (Ubuntu).
I need to select a new server and wonder where I can save money.
My app is very small.
My app does NOT use HDDs; nothing is stored in the file system.
I notice that even if I overwrite the app with a new version (new files), the application does not change until I restart the thread.
My focus is to save money on the hard drive and buy servers with old HDDs.
Question
How does NGINX work?
Does it load the app into memory and serve it from there?
Or does it know the location of the files but create some cache and serve from there, which would mean a hard drive is important?
In short: does NGINX use the hard drive?
My app uses a CDN for all files; they live on third-party Azure Blobs and Amazon S3.
I want to save money by using HDDs instead of SSDs and NVMe.
Which HDDs should I select to get the best performance?
Again, my app only serves web content that lives with third parties.
You pretty much already have the answer. NGINX won't do disk writes or reads unless it needs to.
For your case, static files are hosted elsewhere, so obviously your NGINX has no need to interact with the disk for serving those.
The app is served by using NGINX as a reverse proxy. The communication between NGINX and the app is usually done over a network socket (e.g. your app binds to a TCP port) or a UNIX socket. In neither case does disk speed matter.
You would be better off asking yourself whether your app logic does any reads or writes to disk. If the answer is no, or "not much", then yes, a plain HDD would be sufficient.
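For illustration, a typical reverse-proxy block for this setup might look like the following (the upstream port is an assumption; Kestrel commonly listens on 5000). Nothing here touches the disk on the request path:

```nginx
# Sketch of an NGINX reverse proxy for a .NET Core (Kestrel) app.
# Requests are forwarded over a TCP socket to the app on localhost;
# no static files are read from disk by NGINX in this configuration.
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass         http://127.0.0.1:5000;
        proxy_http_version 1.1;
        proxy_set_header   Host $host;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```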

How could I display memory and CPU usage of my Minecraft server on my website?

So I am trying to display the memory and CPU usage of my Minecraft server on my website, but I don't know how to do that. I have searched on YouTube but haven't found anything.
There are many approaches you can take.
Linux / hosting
If you are using a hosting company that gives you a nice-looking website/panel: you could scrape the statistics from that panel with some sort of bot and reuse them. If they don't provide one, you could look at installing a plugin, or creating your own, such as Lag Monitor.
They may be using Multicraft; statistics will be at the top of the panel if they have some measurement for it.
If you host the Minecraft server in a Docker container, have a look at docker stats.
If you host the Minecraft server directly on the system as a service (systemctl), refer to: Retrieve CPU usage and memory usage of a single process on Linux?
You would need to create a script that gathers these values, formats them and returns them. You could publish the statistics in near real time over some sort of socket connection such as socket.io.
Alternatively, you could create an API server wherever you run the Minecraft server (if on your own machine) that runs these commands and lets your website fetch the results periodically or on page load; see the sketch below.
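A minimal sketch of that API-server idea, assuming a Linux host, Node.js with Express, and a Java server process discoverable via pgrep (the endpoint, port and process name are made up for illustration):

```js
// Tiny stats API: find the Minecraft server's PID, then ask ps for its
// CPU and memory share, and return both as JSON.
const express = require('express');
const { execFile } = require('child_process');

const app = express();

app.get('/stats', (req, res) => {
  // pgrep -f matches against the full command line of running processes.
  execFile('pgrep', ['-f', 'minecraft_server'], (err, pidOut) => {
    if (err) return res.status(500).json({ error: 'server process not found' });
    const pid = pidOut.trim().split('\n')[0];
    // %cpu and %mem are the process's share of CPU and physical memory.
    execFile('ps', ['-p', pid, '-o', '%cpu,%mem', '--no-headers'], (err2, out) => {
      if (err2) return res.status(500).json({ error: 'ps failed' });
      const [cpu, mem] = out.trim().split(/\s+/);
      res.json({ cpu: Number(cpu), mem: Number(mem) });
    });
  });
});

app.listen(3000);
```

Your website could then fetch http://your-host:3000/stats on page load or on an interval and render the numbers.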
Windows
If you are hosting your Minecraft server on Windows, you are doing something wrong. Getting memory and CPU usage would be the least of your problems in that case, and you should look into getting proper hosting for your server.
The same goes if you are running the server on your own computer on your own network: unless you have the experience and knowledge to do that safely, you should definitely migrate to a Linux-based hosting solution such as a VPS.
TL;DR:
Get a VPS, set up an API server, get statistics from that. There probably is no tutorial for you to follow.

What is the best service for a GCP FTP Node App?

Ok, so a bit of background on what we are doing.
We have various weather stations and soil monitoring stations across the country that gather data and then upload it via FTP to a server for processing.
Note: this server is not located in GCP, but we are migrating all our services over at the moment.
Annoyingly, FTP is the only protocol these particular stations support. Newer stations thankfully use REST APIs instead, which makes things much simpler.
I have written a small Node.js app that works with ftp-srv. This acts as the FTP server.
I have also written a new FileSystem class that hooks directly into Google Cloud Storage, so instead of reading a local directory, it reads the GCS directory.
This allows the weather stations to upload their dump files directly to GCP for processing.
My question is: what is the best service to use?
First I thought of App Engine; since it's just a small Node.js app, I don't really want to create a VM just to run it.
However, I have been unsuccessful in opening up port 21 and the other ports used for passive FTP.
I then considered Kubernetes Engine. To be honest, I don't know anything about it yet, but it seems like overkill just to run this small app.
My last thought was Compute Engine. I have a working copy with PROFTPD installed and working, so I know I can get the ports open and have data flowing, but it feels like overkill to run a full VM for something that just acts as an intermediary between the weather stations and GCS.
Any recommendations would be very appreciated.
Thanks!
Kubernetes just for FTP would be using a crane to lift your fork.
Google Compute Engine and PROFTPD will fit in a micro instance at a whopping cost of about $6.00 per month.
The other Google Compute services do not support FTP. This includes:
App Engine Standard
App Engine Flexible
Cloud Run
Cloud Functions
This leaves you with either Kubernetes or Compute Engine.
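As a rough sketch of what the ftp-srv side of this looks like on a Compute Engine VM (the external IP, passive port range and credentials are placeholders, and the passive range must also be opened in the VM's firewall rules):

```js
// Minimal ftp-srv setup for a VM with a public IP. A custom GCS-backed
// FileSystem (like the one the poster describes) could be passed to
// resolve() instead of a local root.
const { FtpSrv } = require('ftp-srv');

const ftpServer = new FtpSrv({
  url: 'ftp://0.0.0.0:21',
  pasv_url: 'YOUR_VM_EXTERNAL_IP', // IP advertised to clients for PASV
  pasv_min: 1024,                  // passive data ports; open this range
  pasv_max: 1048,                  // in the firewall alongside port 21
});

ftpServer.on('login', ({ username, password }, resolve, reject) => {
  if (username === 'station' && password === process.env.FTP_PASSWORD) {
    resolve({ root: '/data/uploads' }); // or: fs: new GcsFileSystem(...)
  } else {
    reject(new Error('Invalid credentials'));
  }
});

ftpServer.listen().then(() => console.log('FTP server listening on port 21'));
```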

Node Red - Accessing dashboard from remote server

I have a question regarding the Node-RED dashboard. I've got my dashboard all set up and working. Now I want to be able to access the dashboard from outside my local network. Right now I do this through a VNC server. What needs to happen next is that clients need to be able to access the dashboard, but of course they are not getting access to my VNC server. I have done my fair share of Googling. I (somewhat) understand that a service like ngrok (ngrok.com) or dataplicity (dataplicity.com) is what I am looking for. What would be the best way of setting this up safely?
Might be useful to clarify: I'm using a Raspberry Pi!
Thanks in advance!
If you want to give the outside world access to your dashboard, you could also consider hosting your Node-RED application in the cloud. See the links at the bottom left of https://nodered.org/docs/getting-started/
Most of those services have a free tier, so it might cost you nothing.
If you cannot deploy your complete Node-RED application in the cloud (e.g. because it reads local sensors), you can split it into two Node-RED applications: one running locally and one (with the dashboard) running in the cloud. The two applications then need to exchange messages; the cloud services mentioned on that page also provide a secure way to send and receive events between the local and cloud applications. A rough illustration of that exchange follows below.
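Inside Node-RED you would typically wire this up with the built-in MQTT nodes, but as a plain-Node.js illustration of the same idea (broker URL, topic and credentials are placeholders), the local half publishing readings to a cloud broker might look like this:

```js
// Sketch: the local instance publishes sensor readings to a cloud MQTT
// broker; the cloud instance (hosting the dashboard) subscribes to the
// same topic and renders the values.
const mqtt = require('mqtt');

const client = mqtt.connect('mqtts://broker.example.com:8883', {
  username: 'pi',
  password: process.env.MQTT_PASSWORD,
});

client.on('connect', () => {
  setInterval(() => {
    const reading = JSON.stringify({ temperature: readLocalSensor() });
    client.publish('home/sensors/temperature', reading);
  }, 5000);
});

function readLocalSensor() {
  return 21.5; // placeholder for a real GPIO/sensor read
}
```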

cloudfoundry: how to use filesystem

I am planning to use the Cloud Foundry PaaS (from VMware) to host my Node.js application. I have seen that it supports Mongo and Redis in the service layer, and Node.js as a framework. So far so good.
Now I need to store my media files (images uploaded by users) on a filesystem. I have the metadata stored in Mongo.
I have been searching the internet, but have not found good information yet.
You cannot do that for the following reasons:
There are multiple host machines running your application. They each have their own filesystems. Each running process in your application would see a different set of files.
The host machines on which your particular application is running can change moment-to-moment. Indeed, they will change every time you re-deploy your application. Every time a process is started on a new host machine, it will see an empty set of files. Every time a process is stopped on an old host machine, all the files would be permanently deleted.
You absolutely must solve this problem in another way.
Store the media files in MongoDB GridFS.
Store the media files in an object store such as Amazon S3 or Rackspace Cloud Files.
The filesystem in most cloud solutions is "ephemeral", so you cannot rely on it. You will have to use something like S3 or a database for this purpose.
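A minimal sketch of the GridFS option, using the official mongodb driver (the connection string, database and file names are placeholders):

```js
// Store an uploaded media file in MongoDB GridFS instead of the local
// (ephemeral) filesystem. GridFS chunks the file across documents, so
// every instance of the app sees the same data.
const fs = require('fs');
const { MongoClient, GridFSBucket } = require('mongodb');

async function storeMedia(localPath, remoteName) {
  const client = await MongoClient.connect(process.env.MONGO_URL);
  const bucket = new GridFSBucket(client.db('app'), { bucketName: 'media' });

  await new Promise((resolve, reject) => {
    fs.createReadStream(localPath)
      .pipe(bucket.openUploadStream(remoteName))
      .on('finish', resolve)
      .on('error', reject);
  });

  await client.close();
}
```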
