Node.js local agent functionality

I have a website hosted on Heroku and Firebase (React frontend, Node.js backend), and I have some "long-running scripts" that I need to run. My idea was to deploy a Node process to my Raspberry Pi to execute these (because I need resources from inside my network).
How would I set this up securely?
I think I need to create a Node.js process that regularly checks the central server for jobs to be done. Can I use sockets for this? What technology would you use?
I think the design would be:
1. Local agent starts and connects to server
2. Server sends messages to the agent, or the local agent polls at a set interval
EDIT: I have multiple users that I would like to serve. Each user should be able to "download" the agent and set it up so that it connects to the remote server.

You could just use Firebase for this, right? Create a new Firebase DB for "tasks" or whatever that is only accessible to you. When the central server (whatever that is) determines there's a job to be done, it adds it to your tasks DB.
Then you write a simple Node app you can run on your Raspberry Pi that starts up, authenticates with Firebase, and listens for updates on your tasks database. When a task is added, the app runs your long-running job, then removes that task from the database.
Wrap it up in a bash script that will automatically run it again if it crashes, and you've got a super simple pub/sub setup without needing to expose anything on your local network.
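A minimal sketch of what that agent could look like, assuming the Firebase Admin SDK, a Realtime Database path named /tasks, and a service-account key file (the project URL and file names here are placeholders, not from the question):

```js
// agent.js - sketch of the Raspberry Pi task agent (names are placeholders)
const admin = require('firebase-admin');

// Assumes a service-account key downloaded from your Firebase project.
admin.initializeApp({
  credential: admin.credential.cert(require('./serviceAccountKey.json')),
  databaseURL: 'https://your-project-id.firebaseio.com' // placeholder URL
});

const tasksRef = admin.database().ref('tasks');

// Fires for every existing task and again whenever a new one is pushed.
tasksRef.on('child_added', async (snapshot) => {
  const task = snapshot.val();
  try {
    await runLongTask(task);      // the long-running work, using local resources
  } finally {
    await snapshot.ref.remove();  // drop the task once it has been handled
  }
});

async function runLongTask(task) {
  console.log('Running task', task);
  // ...do the actual work against resources inside your network...
}
```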

Related

Set up a secondary redundant failover Node.js server

Introduction
So I made a Discord bot and hosted it on a Heroku server. The bot is basic; it just listens for new messages and responds to them.
I am using Heroku's free tier and discovered that there are so-called "free dyno hours". When they run out at the end of the month, my bot goes offline until the next month.
I own a Raspberry Pi, so I thought it would be a good idea to host the bot there as well in case Heroku goes offline.
What I want to achieve
Host the Discord bot on a hosting platform (Heroku).
Use my Raspberry Pi as a failover server.
Ensure that only one instance of the bot is running at a time.
If both servers are online, Heroku (chosen as the main server) has priority.
If the application on Heroku somehow crashes or goes offline, the bot on my Raspberry Pi gets activated.
And vice versa.
How would that work if I had 3 or more servers?
How would I approach this if the bot were connected to a database?
This is just an example; I would appreciate a more general solution.
Nginx (load balancer)
When searching for help, I came across load balancers and Nginx.
If I understand correctly, it basically functions like a gateway: all requests go to the server Nginx is hosted on and then get redirected to one of many servers (depending on how loaded they are, etc.).
I have a few problems with this:
my bot doesn't receive any requests, it just listens to the Discord API.
if the server with Nginx fails, does it all fall apart?
My solution 1
My first approach was to let the bot run on both Heroku and the Raspberry Pi at the same time.
This guarantees that the bot runs on at least one server if the other one fails.
But it is not ideal, as the bot responds twice when both servers are up.
My solution 2
Heroku:
I added a simple REST API endpoint to the bot app with Express.js.
This way I can check if the bot on Heroku is running.
Raspberry Pi:
I wrapped the bot app with a Node program that basically functions like a switch.
Link to a git repo: https://github.com/Matyanson/secondary-node-server.git
Every 3 minutes the program calls the endpoint and uses the status code to check whether the Heroku app is down.
It then either starts or shuts down the app on the Raspberry Pi (using pm2).
This kind of works, but I am sure there is a better solution!
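For illustration, a rough sketch of such a switch, assuming the Heroku app exposes a health endpoint and the local bot is already registered with pm2 under the name "bot" (both are placeholders):

```js
// watchdog.js - sketch of the Raspberry Pi "switch" (URL and process name are placeholders)
const { exec } = require('child_process');

const HEALTH_URL = 'https://my-bot.herokuapp.com/health'; // the Express endpoint on Heroku
const CHECK_INTERVAL = 3 * 60 * 1000;                     // every 3 minutes

async function check() {
  let herokuUp = false;
  try {
    const res = await fetch(HEALTH_URL); // global fetch, Node 18+
    herokuUp = res.ok;                   // 2xx means the Heroku bot is alive
  } catch {
    herokuUp = false;                    // network error: treat Heroku as down
  }
  // Start the local copy only while Heroku is down; stop it when Heroku is back.
  exec(herokuUp ? 'pm2 stop bot' : 'pm2 start bot', (err) => {
    if (err) console.error('pm2 command failed:', err.message);
  });
}

setInterval(check, CHECK_INTERVAL);
check();
```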
There is also a problem with Heroku (you can skip this):
Heroku uses dynos (containers) to run apps. There are different dyno configurations, including "web" and "worker".
A worker dyno is used for background jobs and is never put to sleep.
A web dyno is the only dyno type that receives HTTP traffic. It is put to sleep after 30 minutes of receiving no web traffic.
I can switch either of these on or off:
both: there are 2 instances of my bot.
worker only: I can't connect to the endpoint.
web only: I have to "ping" the API at least every 30 minutes or the bot sleeps.
Why am I asking?
Not because I need to solve this exact problem, but because I want to learn a good way of doing this that I can use in future projects.
I also think this would be a good way to avoid relying 100% on external hosting providers.

Can I run a front-end and back-end on Netlify?

I want to practice creating my own RESTful API service to go along with a client-side application that I've created. My plan is to use Node and Express to create a server. On my local machine, I know how to set up a local server, but I would like to be able to host my application (client and server) online as part of my portfolio.
The data that my client application would send to the server would not be significant in size, so there wouldn't be a need for a database. It would be sufficient to just have my server save received data dynamically in an array, and I wouldn't care about having that data persist if the user exits the webpage.
Is it possible to use a service like Netlify in order to host both a client and server for my purposes? I'm picturing something similar to how I can start up a local dev server on my computer so that the front-end can interface with it. Except now I want everything hosted online for others to view. I plan to create the Express server in the same repo as the front-end code.
No, Netlify doesn't allow you to run a server or backend. However, they do allow you to run serverless functions in the cloud. These can run for up to 10 seconds at a time. Netlify also has a beta feature called "background functions" that can run for up to 15 minutes. But honestly, for a RESTful API there are surely better solutions out there.
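To illustrate the serverless-function option, a minimal sketch of a Netlify Function (the file name and response are just an example):

```js
// netlify/functions/hello.js - minimal sketch of a Netlify Function
exports.handler = async (event) => {
  // event.httpMethod, event.path and event.body describe the incoming request
  return {
    statusCode: 200,
    body: JSON.stringify({ message: 'Hello from a serverless function' })
  };
};
```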
If you are still looking for a "Netlify for backend", you can consider Qovery. They explain here why it is a good fit for their users.

Do SmartApps run remotely even when all interactions are local?

I am trying to have my SmartApp talk to my local REST server at my company. This REST server is not externally accessible. In an attempt to narrow down the issue, I have created a groovy program that interacts with the REST server. I have executed this on my own computer and coworkers' computers and they are all able to access the REST server as expected. When I try to access the REST server from my SmartApp (using the SmartThings httpGet() function), I only get ConnectionTimeoutExceptions. Is my SmartApp executing from an external perspective?
From the SmartThings documentation, all apps except Smart Home Monitor and Smart Lights run remotely (https://support.smartthings.com/hc/en-us/articles/209979766-Local-processing):
Smart Home Monitor and Smart Lights are the only SmartApps with local processing capabilities at this time. We are working on additional local SmartApp options.
That's why you cannot access your local server from your SmartApp.
But what you can do is go the other way. Instead of having your SmartApp call your local server, you can make your local server call your SmartApp (by using a WebServices SmartApp).
Perhaps it does not fit your needs, but you could imagine the following workflow:
Your local server makes a call every minute to your SmartApp on GET /needs.
Your SmartApp returns what it needs.
Your local server fulfils the request and reports back with POST /result.
You can imagine a better flow, but this is just a sample.
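A sketch of the local-server side of that workflow, assuming it is a Node process; only the GET /needs and POST /result endpoints come from the answer above, while the base URL, token, and handler are placeholders:

```js
// poller.js - sketch of the local-server side of the workflow above
const SMARTAPP_BASE = 'https://example.invalid/smartapp'; // placeholder: URL of your WebServices SmartApp
const TOKEN = process.env.SMARTAPP_TOKEN;                 // OAuth token for that SmartApp

setInterval(async () => {
  try {
    // 1. Ask the SmartApp what it needs.
    const res = await fetch(`${SMARTAPP_BASE}/needs`, {
      headers: { Authorization: `Bearer ${TOKEN}` }
    });
    const needs = await res.json();

    // 2. Fulfil the request against the internal REST server, then report back.
    const result = await handleNeeds(needs);
    await fetch(`${SMARTAPP_BASE}/result`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${TOKEN}` },
      body: JSON.stringify(result)
    });
  } catch (err) {
    console.error('Polling cycle failed:', err.message);
  }
}, 60 * 1000); // every minute, as in the workflow above

async function handleNeeds(needs) {
  // Placeholder: call your internal, non-public REST server here.
  return { ok: true, needs };
}
```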

Is there a way to run a node task in a child process?

I have a node server, which needs to:
Serve the web pages
Keep querying an external REST API, save data to the database, and send data to clients when certain updates come from the REST API.
Task 1 is just a normal Node task. But I don't know how to implement task 2. This task won't expose any interface to the outside. It's more like a background task.
Can anybody suggest an approach? Thanks.
To make a second node.js app that runs at the same time as your first one, you can just create another node.js app and then run it from your first one using child_process.spawn(). It can regularly query the external REST API and update the database as needed.
The part about "Send data to clients for certain updates from REST API" is not so clear what you're trying to do.
If you're using socket.io to send data to connected browsers, then the browsers have to be connected to your web server which I presume is your first node.js process. To have the second node.js process cause data to be sent through the socket.io connections in the first node.js process, you need some interprocess way to communicate. You can use stdout and stdin via child_process.spawn(), you can use some feature in your database or any of several other IPC methods.
Because querying a REST API and updating a database are both asynchronous operations, they don't take much of the CPU of a node.js process. As such, you don't really have to do these in another node.js process. You could just have a setInterval() in your main node.js process, query the API every once in a while, update the database when results are received and then you can directly access the socket.io connections to send data to clients without having to use a separate process and some sort of IPC mechanism.
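As a sketch of that last suggestion (a single process using setInterval plus socket.io), where the external API URL and the database layer are placeholders:

```js
// Sketch of the single-process approach: poll inside the web server itself.
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

app.use(express.static('public')); // task 1: serve the web pages

async function pollExternalApi() {
  try {
    const res = await fetch('https://api.example.com/updates'); // placeholder REST API
    const data = await res.json();
    await saveToDatabase(data);   // your database layer goes here
    io.emit('api-update', data);  // push the update to connected browsers
  } catch (err) {
    console.error('Polling failed:', err.message);
  }
}

setInterval(pollExternalApi, 30 * 1000); // task 2: query the API every 30 seconds

async function saveToDatabase(data) {
  // Placeholder for whatever database you use.
}

server.listen(3000);
```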
Task 1:
Express is a good way to accomplish this task.
You can explore:
http://expressjs.com/
Task 2:
Once you are set up with Express.js, you can write your logic within the Express framework.
Task 2 can then be done with the node module forever. It's a simple tool that runs your background scripts forever. You can use forever to run scripts continuously (whether they are written in Node.js or not).
Have a look:
https://github.com/foreverjs/forever
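If you prefer to control it from code rather than the CLI, the companion forever-monitor module can supervise the background script; a sketch (the script name and restart limit are just examples):

```js
// Sketch using forever-monitor to supervise the background script from code.
const forever = require('forever-monitor');

const child = new (forever.Monitor)('background-task.js', {
  max: 10,      // give up after 10 restarts
  silent: false
});

child.on('exit', () => {
  console.log('background-task.js exited after 10 restarts');
});

child.start();
```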

Restart a Node.js server programmatically

User case:
My Node.js server starts with a configuration wizard that allows the user to change the port and scheme, and even update the Express routes.
Question:
Is it possible to apply this kind of configuration change on the fly? Restarting the server would definitely bring all the changes online, but I'm not sure how to trigger it from code.
Changing core configuration on the fly is rarely practiced, and Node.js and most HTTP frameworks do not support it at this point either.
Modifying the configuration and then restarting the server is a completely valid solution, and I suggest you use it.
To restart the server programmatically, you have to execute the logic outside of Node.js so that it can continue once the Node.js process is killed. Provided you are running the Node.js server on Linux, a Bash script sounds like the best tool available to you.
Implementation will look something like this:
Client presses a switch somewhere on your site powered by Node.js
Node.js then executes some JavaScript code which instructs your OS to execute a bash script, let's say script.sh
script.sh restarts node.js
Done
If any of the steps is difficult, ask about it. Though step 1 is something you are likely handling yourself already.
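A sketch of step 2, the Node.js side handing the restart off to an external script; the route and the script path are placeholders:

```js
// Sketch of step 2: Node.js handing the restart off to an external script.
const { spawn } = require('child_process');
const express = require('express');

const app = express();

app.post('/restart', (req, res) => {
  res.send('Restarting...');
  // Detach the script so it keeps running after this Node.js process is killed.
  const child = spawn('bash', ['./script.sh'], { detached: true, stdio: 'ignore' });
  child.unref();
});

app.listen(3000);
```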
I know this question was asked a long time ago, but since I ran into this problem I will share what I ended up doing.
For my problem, I needed to restart the server because the user is allowed to change the port of their website. What I ended up doing is wrapping the whole server creation (https.createServer / server.listen) in a function called startServer(port). I call this function at the end of the file with a default port. The user changes the port by accessing the endpoint /changePort?port=3000. That endpoint calls another function, restartServer(server, res, port), which calls startServer(port) with the new port and then redirects the user to the site on the new port.
Much better than restarting the whole Node.js process.
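A condensed sketch of that approach, assuming an Express app served over HTTPS; only the /changePort endpoint and startServer(port) come from the answer, while the certificate paths and ports are placeholders:

```js
// Sketch of the in-process approach: wrap server creation in startServer(port)
// and rebind on /changePort.
const express = require('express');
const https = require('https');
const fs = require('fs');

const app = express();
const tlsOptions = {
  key: fs.readFileSync('key.pem'),   // placeholder certificate files
  cert: fs.readFileSync('cert.pem')
};

let server;

function startServer(port) {
  server = https.createServer(tlsOptions, app).listen(port, () => {
    console.log(`Listening on port ${port}`);
  });
}

app.get('/changePort', (req, res) => {
  const newPort = Number(req.query.port);
  const oldServer = server;
  startServer(newPort);                                // the new port does not clash with the old listener
  res.redirect(`https://${req.hostname}:${newPort}/`); // send the user to the new port
  oldServer.close();                                   // stop accepting new connections on the old port
});

startServer(3000); // default port, as described in the answer above
```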
