Node-RED - Accessing the dashboard from a remote server

I have a question regarding the Node-RED dashboard. I've got my dashboard all set up and working. Now I want to be able to access it from outside my local network. Right now I do this through a VNC server. What needs to happen next is that clients need to be able to access the dashboard, but of course they won't be getting access to my VNC server. I've done my fair share of googling, and I (somewhat) understand that a service like ngrok (ngrok.com) or Dataplicity (dataplicity.com) is what I am looking for. What would be the best way of setting this up safely?
It might be useful to clarify: I'm running this on a Raspberry Pi!
Thanks in advance!

If you want to give the outside world access to your dashboard, you could also consider hosting your Node-RED application in the cloud. See the links at the bottom left of https://nodered.org/docs/getting-started/
Most of those services have a free tier, so it might cost you nothing.
If you cannot deploy your complete Node-RED application in the cloud (e.g. because it reads local sensors), you can split it into two Node-RED applications: one running locally and one (with the dashboard) running in the cloud. The two applications then need to exchange messages; the cloud services mentioned on that page also provide a secure way for the cloud application to send and receive events, which you can use for this.
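If you'd rather keep everything running on the Pi, a tunnelling service like the ngrok you mention is the quickest route. A minimal sketch, assuming Node-RED is listening on its default port 1880 and the dashboard is served under /ui (adjust if you've changed either):

    # Expose the local Node-RED port through an ngrok tunnel.
    ngrok http 1880
    # ngrok prints a public URL such as https://<random-id>.ngrok.io;
    # the dashboard is then reachable at that URL with /ui appended.

Before exposing anything, enable authentication in Node-RED's settings.js (adminAuth for the editor, httpNodeAuth for the dashboard endpoints) so the tunnel isn't an open door.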

Related

Using a VPS (Ubuntu) to create a server with the Google Calendar API - authentication failed

So I am doing this project where I'm basically creating a server using Taskwarrior and the Google Calendar API to upload tasks created from the terminal to Google Calendar.
Originally I did this on my personal computer (Arch Linux) and it worked, but I can't keep my computer running 24/7, which is why I opted for a VPS. The VPS is running Ubuntu 20.04 without a GUI. I repeated the same process I followed on my computer on the server, and everything went well until the step where Google asks you to authorize the program, at which point I got a "localhost refused to connect" message.
I'm going to assume the connection is refused because the redirect goes to localhost on the VPS rather than to my local network.
My question is: how do I get that connection accepted by Google? Is it something I need to add or change in the API settings on Google's side?
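One common workaround on a headless VPS is to run an OAuth flow that doesn't rely on a localhost redirect at all. A rough sketch in Python with google-auth-oauthlib (assuming credentials.json was downloaded from the Google Cloud console; names and scope here are illustrative):

    # Sketch: console-based OAuth flow for a headless VPS.
    # Assumes google-auth-oauthlib is installed and that credentials.json
    # (an "installed app" OAuth client) sits next to this script.
    from google_auth_oauthlib.flow import InstalledAppFlow

    SCOPES = ["https://www.googleapis.com/auth/calendar"]

    flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
    # run_console() prints an authorization URL to open in any browser and
    # asks you to paste the resulting code back into the terminal, so no
    # localhost redirect on the VPS is involved.
    creds = flow.run_console()

    # Persist the token so the flow only has to run once.
    with open("token.json", "w") as fh:
        fh.write(creds.to_json())

Note that Google has since been deprecating this out-of-band style of flow; the other option is to complete run_local_server() on a machine that has a browser and copy the resulting token.json to the VPS, where the saved refresh token works fine headlessly.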

Running a Flask application without the command prompt

I have been developing a dashboard based on Python Flask and am close to finishing the work. Once it is complete, I don't know how to host my Flask application on a production server. To be clear: I don't want my users to have to run the Flask application from a command prompt and then type the URL into a browser to see the dashboard. I want users to only have to type the URL into the browser.
It's important for you to know the basics of Flask: it is a framework for serving web applications, so what you need to do is host a web application...
You can host it on your LAN or on a dedicated server on the web. You can use Heroku (https://devcenter.heroku.com/articles/getting-started-with-python) to host your app for free (the free plan has a lot of limitations). Heroku has support for some RDBMSs, such as Postgres. Note that the Heroku free plan is not meant to be used as a production server!
I suppose you've been using the embedded Flask server (the dev server), which is NOT production ready; instead you should be looking at Gunicorn or some other WSGI server.
If you want to deploy to your client's server, you'll need a reverse proxy (nginx, Apache2), a WSGI server (Gunicorn, uWSGI, ...) running your app, and a few more pieces. But if you don't know what you are doing, you may want to talk to a sysadmin, since you'll need to manage the server.
Check this guide on how to do it.
Hope it helps!
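As a concrete sketch of the Gunicorn route (assuming your Flask instance is named app inside app.py; adjust the module path to match your project):

    pip install gunicorn
    # Four worker processes, listening on all interfaces, port 8000.
    gunicorn --workers 4 --bind 0.0.0.0:8000 app:app

A reverse proxy such as nginx then sits in front of port 8000, terminates TLS, and serves everything on ports 80/443, so your users only ever type a URL into the browser.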

What is the best service for a GCP FTP Node App?

Ok, so a bit of background on what we are doing.
We have various weather and soil monitoring stations across the country that gather data and then upload it via FTP to a server for processing.
Note: this server is not located in the GCP, but we are migrating all our services over at the moment.
Annoyingly FTP is the only service that these particular stations allow. Newer stations thankfully are using REST APIs instead, so that makes it much simpler.
I have written a small Node.js app built on ftp-srv. This acts as the FTP server.
I have also written a new FileSystem class that will hook directly into Google Cloud Storage. So instead of getting a local directory, it reads the GCS directory.
This allows for weather stations to upload their dump files direct to GCP for processing.
My question is, what is the best service to use?
First I thought of using App Engine: since it's just a small Node.js app, I don't really want to have to create a VM just to run it.
However, I have been unable to open port 21 and the other ports used for passive FTP there.
I then thought of using Kubernetes Engine. To be honest, I don't know anything at all about it yet, but it seems like overkill just to run this small app.
My last thought would be to use Compute Engine. I have a working copy with ProFTPD installed, so I know I can get the ports open and have data flowing, but it feels like overkill to run a full VM for something that only acts as an intermediary between the weather stations and GCS.
Any recommendations would be very appreciated.
Thanks!
Kubernetes just for FTP would be using a crane to lift your fork.
Google Compute Engine and ProFTPD will fit in a micro instance at a whopping cost of about $6.00 per month.
The other Google Compute services do not support FTP. This includes:
App Engine Standard
App Engine Flexible
Cloud Run
Cloud Functions
This leaves you with either Kubernetes or Compute Engine.
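For the Compute Engine route, the firewall side looks roughly like this (a sketch using gcloud; the passive range 30000-30100 and the ftp-server tag are assumptions that must match your ProFTPD configuration):

    # Allow the FTP control port and the passive data range to reach
    # instances tagged ftp-server.
    gcloud compute firewall-rules create allow-ftp \
        --direction=INGRESS \
        --allow=tcp:21,tcp:30000-30100 \
        --target-tags=ftp-server

On the ProFTPD side, the matching directives would be PassivePorts 30000 30100 together with MasqueradeAddress set to the instance's external IP, so passive-mode clients are told the correct address to connect back to.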

MongoDB in Docker drops databases automatically and periodically

I have a web page project developed on the MEAN stack, deployed on a server with Docker.
I have two containers running: one hosts the API and the service, the other runs the MongoDB service. Everything seems right, but I have one big problem: periodically, the system drops my databases.
The last incident was on 11/03/2018, and the one before that was two weeks earlier.
These are the MongoDB container's latest logs:
Is there any configuration I could edit, or have you run into this problem yourself?
If you need more information, please do ask. Thanks.
Your database is being accessed from an external (public) IP: your MongoDB server is publicly accessible and you have no authentication enabled.
This official doc describes how you can enable authentication: https://docs.mongodb.com/manual/tutorial/enable-authentication/
Furthermore, you can restrict access to the server. If your app server and your MongoDB server are on the same network, you don't need to expose MongoDB to the public at all.
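A minimal sketch of both fixes with the official mongo image (the container and network names here are assumptions): don't publish the port at all, and create a root user so mongod requires authentication:

    # Create an internal network shared by the API and the database.
    docker network create app-net

    # No -p/--publish flag, so mongod is reachable only from containers on
    # app-net. The MONGO_INITDB_ROOT_* variables create an admin user on
    # first start, and --auth makes mongod reject unauthenticated clients.
    docker run -d --name mongo --network app-net \
        -e MONGO_INITDB_ROOT_USERNAME=admin \
        -e MONGO_INITDB_ROOT_PASSWORD=change-me \
        mongo --auth

The API container then connects over the internal network with a URI like mongodb://admin:change-me@mongo:27017, and nothing on the public IP can reach the database at all.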

Hosting Neo4j Browser Separately From the Server

I want to have three different Neo4j instances running (each with a different database). Then I need three different Neo4j Browser visualizers (think project1.domainname.com / project2.domainname.com / project3.domainname.com), each one mapping to a specific database instance.
I've managed to get the three different database instances running on a single Azure VM - so far so good.
But I'm not sure how to create and map those browsers. I'd like to run each of them as an Azure website, as that would help with some other problems I've foreseen.
1) Where is this browser HTML etc. so I can load it to the Azure Website?
2) Where in that code would I specify the IP Address and Port that browser should be talking to?
I've also heard some people talking about an Azure for Neo4j project, but that is nearly five years old, and the Neo4j guys said to put the database instances on VMs. Were they right?
The Neo4j Browser is an Angular app that you can find in the Neo4j source code at https://github.com/neo4j/neo4j/tree/2.3/community/browser
You can set the host that the browser app can talk to:
:config host:"http://host:port"
This is an undocumented feature and might be removed.
For 3.0 the intent is to decouple the browser and Neo4j anyway.
