Node server GUI frontend - node.js

Well, we all know about headless servers. Actually, probably the vast majority of servers out there are headless.
As usual (it seems), my situation calls for something quite different. Basically, the proposed architecture looks more or less like this:
The app server (node.js) runs on a physical machine that is directly connected to two screens.
Between this machine and the 'net there are all sorts of regular networking layers. Please keep in mind that one of the main reasons for this setup is physical portability: i.e., the client gets the necessary hardware as the product. The server itself relies on a CDN for static files etc.
Each monitor/screen needs to show something different, produced by the same node server.
For now this server will probably run on Windows, but given a concept (which is what my question is after), I can change the code to run on the target platform. Well, depending on my code, this could even be done automatically.
So, my actual question. Node is quite flexible in that it can be run by anything - even custom-made software (C++, Delphi, even GM). Just shell_exec('node server.js') and we're off.
But the screens themselves need to be quite dynamic. So node needs to influence both screens in some way. A few options I'm considering:
A custom app which creates two resizable, featureless windows with an embedded Chromium browser, to be controlled by the node server somehow (how would node interact with these browsers?)
A custom app which updates the two screens' UI based on node's CLI output. Since I need something flashy for the UI, this app would be created in something like GameMaker, or a similar engine.
PS: Just in case you're asking; the physical connection opposed to a network one (eg; web-based GUI frontend) is by design.

I'd just wire up the result/monitoring screens as regular HTML pages. In your Node app, create a second HTTP server (on a non-standard port, firewalled from the public) that serves up the monitoring page.
Use socket.io to send the realtime data to the monitoring page, which can make everything look pretty. Fire it up in a full-screen instance of Chrome.
This approach completely frees you from any kind of platform dependency, and decouples the monitoring app from the server app. It leaves you the latitude to run the monitoring app on a separate box if necessary.
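To make this concrete, here's a minimal sketch of the second-server idea (assuming socket.io v4; the port, event name, and payload are all made up for illustration):

    // monitor-server.js - second HTTP server for the monitoring screens
    const http = require('http');
    const { Server } = require('socket.io');

    const httpServer = http.createServer((req, res) => {
      // in a real app, serve the monitoring HTML from disk (or via express.static)
      res.setHeader('Content-Type', 'text/html');
      res.end(`
        <script src="/socket.io/socket.io.js"></script>
        <div id="out"></div>
        <script>
          const socket = io();
          socket.on('stats', s => {
            document.getElementById('out').textContent = JSON.stringify(s);
          });
        </script>`);
    });

    const io = new Server(httpServer);

    io.on('connection', (socket) => {
      // push realtime data to whichever screen just connected
      const timer = setInterval(() => {
        socket.emit('stats', { rss: process.memoryUsage().rss }); // 'stats' is a made-up event
      }, 1000);
      socket.on('disconnect', () => clearInterval(timer));
    });

    // non-standard port, bound locally / firewalled from the public
    httpServer.listen(8081, '127.0.0.1');

Each physical screen then just runs a full-screen Chrome instance pointed at that server, e.g. http://127.0.0.1:8081/?screen=1, and renders whatever the 'stats' events deliver.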


Is running Flask with the default server safe (security-wise)?

I'm thinking about running a very simple Flask server locally (using the default development server) and then opening it up to the web via http://localhost.run/. I intend for this to be a personal webhook server and nothing else.
I've seen related questions before (for example, Is the server bundled with Flask safe to use in production?, etc), but:
I don't care that it won't scale well
I don't expect to get more than one request at a time
I will not be using the debugging mode
My question is this: How safe will my computer and local network be if I do this? I will probably limit requests to POST requests and check to see that they have a special key, or something like that, and the only thing I'm going to be doing with the webhooks is displaying a notification.
Short of someone from the Pallets Project speaking up, the closest thing to an official word is the deployment documentation:
https://flask.palletsprojects.com/en/1.1.x/tutorial/deploy/#run-with-a-production-server
If you have enough access to a server to permit running something that'll listen to a socket, the step of adding a WSGI server isn't a big one. The link above recommends waitress (and provides instructions), but gunicorn and uwsgi will work, too.
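For instance, with waitress that step is a one-liner (the myapp:app module/attribute spec is a placeholder for your own):

    pip install waitress
    waitress-serve --listen=127.0.0.1:8080 myapp:app

Bound to 127.0.0.1 like that, it stays reachable only through whatever tunnel (e.g. localhost.run) you deliberately open.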
Adding my opinion:
Parsing HTTP and dealing with edge cases is hard, so why should Flask/Werkzeug spend effort on them when there are WSGI front-ends that already take on the responsibility? If I were in their position (which I'm not), I'd punt scaling and security to the WSGI servers and focus on making an excellent framework.

Deploying Next.js to Apache server

I've been developing a Next.js website locally and now want to set it up on my Apache server (with cPanel). However, I'm very new to Next.js and Node apps and not too sure how to go about it.
Has anyone done this successfully? Can you list the required steps and what files should be on the server?
Also, can this be done on a subdomain?
Thank you!
To start with some clear terms just so we're on the same page, there are two or three very different things people mean when they say "server":
A Server Machine is a computer that is connected to the internet that you intend to use to serve something to people on the internet.
A Server Program is some software you run on your Server Machine. The job of the Server Program is to actually calculate the responses to various requests.
A Server as a Service is a webapp provided by a company that stores your code and then puts it onto Server Machines with the right Server Program as needed.
While we're here, let's also define:
A Programming Language is the language your website is written in. Some sites have no language (and are just raw HTML/CSS files that are meant to be returned directly to the user). Many sites, though, have some code that should be run on the server and then the result of that code should be returned to the user.
In your case, you have a Machine whose condition we don't know other than that it is running the Program Apache (or probably "Apache HTTP Server"). Apache HTTP server is very old and proven and pretty good at serving raw files back to users. It can also run some Programming Languages like PHP and return the result.
However, Next.JS is built on top of the Programming Language JavaScript, which Apache does not have the ability to run. Next.JS instead wants its Server Program to be Node.
So the problem here is basically that you have a hammer, but only screws. You can't use the tool you have, Apache, to solve the problem you need solved, running Node code and returning the result. To get around this you have two options:
First, you can find a way to access the Server Machine that is currently running Apache and tell it, instead, to run Node pointed at your Next.JS code whenever it starts up. This might not be possible, depending on who owns this machine and how they've set it up.
Second, and probably easier, is to abandon this Machine and instead use a Server as a Service. Heroku, AWS, and Netlify all support Next.JS and have a free tier. The easiest solution, though, is probably to just deploy it on Vercel, which is a Server as a Service run by the same team that makes Next.JS and which has a very generous free tier for you to get started with.
The good news, though, is that yes, Next.JS does totally support being hosted from a subdomain.
Next.JS allows you to build fully functional Node applications, as well as simple statically-generated sites like Jekyll or DocPad. If your use case is a simple statically-generated site, look here: https://nextjs.org/docs/advanced-features/static-html-export
In particular, the next build && next export command will create all the HTML and assets necessary to host a site directly via an HTTP server like Apache or Nginx. Contents will be output to an out directory that could serve as the server root.
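As a sketch, assuming your project lives at /var/www/mysite (the path is hypothetical):

    cd /var/www/mysite
    npx next build && npx next export
    # out/ now holds plain HTML and assets; point the Apache vhost at it:
    #   DocumentRoot /var/www/mysite/out

A vhost (or subdomain vhost) whose DocumentRoot is that out directory then serves the site like any other static files.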
Pay very close attention to what features are not supported via this approach.

Use Google App Engine or Google Cloud Compute VM to Test Run My App?

I'm moving my Three.js app and its customized node.js environment, which I've been running on my local machine to Google Cloud. I want to test things out there, and hopefully soon get some early alpha testing going with other people.
I'm not sure which is the wiser way to go... to upload the repo I've been running locally as-is onto a VM which users would then access via the VM's external IP until I get a good name to call this app... or merge my local node.js environment with what's available via the Google App Engine and run it on GAE.
Issues I'm running into with the Linux VM approach... I'm not sure how to do the equivalent on the VM of what I've been doing locally. In Windows PowerShell I cd into the app directory and then enter node index.js. I'm assuming that with this method of deployment, the app will be reachable as soon as the browser hits the external IP. I should mention too that the app will allow users to save content as well as upload images, and eventually, 3D models and json datasets.
Issues I'm running into with the App Engine approach: it looks like I only have access to a Linux-based command line, and have to install all the node.js modules manually. Meanwhile I have a bunch of files to upload, both the server-side node files and all the frontend stuff. I don't see where to upload those files, and ultimately what I'd like to do is have access to a visual, editable file-tree interface, as I have in Windows and FileZilla, so I can swap files in and out, etc. Alternatively I suppose I could import a repo from GitHub? GitHub would be fine as long as I can visually see what's happening. Is there a visual interface for file structure available in GAE somewhere? Am I missing something?
I went through the GAE "Hello World" tutorial and that worked fine, but was left scratching my head afterward regarding how to actually see and edit the guts of the tutorial app, or even where to look for the files.
So first off, I want to determine what's the better approach, and then if possible, determine how to make the experience of getting my app up there and running a more visual, user-friendly experience.
Thanks.
There are many things to consider when choosing how to run an app, but my instinct for your use case is to simply use a VM on GCE. The most compelling reason for this is that it's the most similar thing to what you have now. You can SSH into the machine and run nohup node index.js & (or node index.js inside tmux/screen if you prefer) and it will start the app and not stop it when you log out of SSH. You can use SCP / SFTP with whatever GUI client you want to upload files. You don't have to learn anything new! If you wanted to, you could even use a Windows VM (although I think you have to pay a little more than for a comparable Linux VM due to the licensing fees).
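In concrete terms, that could look like the following on the VM (directory and file names are hypothetical):

    # after copying the project up with SCP/SFTP:
    cd ~/my-three-js-app
    npm install
    nohup node index.js > app.log 2>&1 &
    # the app keeps running after the SSH session closes; output lands in app.log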
That said, the other way is arguably more "correct" by modern development standards, but it will involve a lot more learning that will prevent you from getting your app running somewhere other than your laptop in the short term:
First, you'll need to learn about Docker and stateless containers, which is basically what your app runs inside of on App Engine (a minimal sketch follows this list).
Next, you'll need to learn how to hook up a separate stateful service (database, file server, ...) to your app's container so you can store your files, etc. in it, and then probably rewrite your app somewhat to use it to store stuff.
Next, you'll probably want some way to automatically deploy this from code instead of manually doing it, which gets you into build systems, package managers, artifact storage, continuous integration systems, and on and on and on.
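To give a flavor of that first step, a bare-bones Dockerfile for a Node app might look like this (the entry-point name is an assumption):

    # Dockerfile - minimal container image for a Node app
    FROM node:18
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci
    COPY . .
    CMD ["node", "index.js"]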
This latter path is certainly what you should choose for a long-running production service if you work with a big team of developers -- but that doesn't mean that it's necessarily the right path for your project today. If you don't care about scaling up automatically, load balancing between nodes, redundant copies of your app running in different regions in case there's a natural disaster, etc., then go with the easy way for now, and you can learn new ways to improve the service when they're actually needed.

Proper architecture to serve several separate, unconnected multi-page websites under one single internet domain?

I own a humble one-server internet domain.
I am migrating from the Apache/php world to the node / angular / react world.
Within this domain, I have various separate projects. One is about my city, another is about my high school, a third is my portfolio, and so on.
One project might be a static SPA, the other might be a huge and highly dynamic professional production-level multi-page app with several frameworks, database connections and the whole kitchen sink.
One might be in react, the other in angular.
In short, a very diverse tech stack distributed among encapsulated applications.
I'm at the thinking stage, where I am considering the right way to run all these apps in one machine, in one domain, the modern way.
In the old Apache days, you would run one instance of Apache and it would serve everything with no problem.
Should I structure my code folders all as child folders of the one domain website?
In this new world, if I have ten websites, do I have to run ten instances of a daemon, or can one instance serve them all? Option A:
One instance of node, or whatever daemon, serves and routes all MPAs:
daemon1
my-domain.com
+--HighSchoolWebsite
+--SanFranciscoWebsite
+--PortfolioPage
Or should it be more like this Option B:
daemon1
+--HighSchoolWebsite
daemon2
+--SanFranciscoWebsite
daemon3
+--PortfolioPage
and if it is Option B, won't the daemons be listening on different ports? I would have to tell people to go to a url with a port number, like mydomain.com:2452/myportfolio. I've never seen this, so it must not be the way it's done, so what's the right way?
I have tried Googling this obviously but most answers tend to discuss MPAs for one single project, not SPA+MPA for several different projects.
Obviously I'm a beginner a bit lost at sea and would appreciate any tips, tricks, hints, rumors, etc.
Thanks very much.
Gee, thanks to all the downvote fairies for being so helpful to someone just trying to learn.
In case anyone else finds themselves at a similar stage, the answer is a "process manager" + a "reverse proxy".
A process manager is a tool that keeps your apps running permanently, whether they're node, or php, or angular. Each app can be assigned a nickname. The process manager I chose was PM2.
A reverse proxy then maps the incoming requested url to the particular app listening on its chosen port. The reverse proxy I used was nginx.
So, for example, let's say you have phpApp running on port 1943. Let's say that in PM2 you nicknamed it pretty-happy-app. During dev, you would visit
http://localhost:1943
to see your pretty-happy-app running.
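For reference, giving the app that nickname in PM2 is a one-liner (app.js here is a placeholder entry point):

    pm2 start app.js --name pretty-happy-app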
Since you want a) external visitors to be able to visit without knowing the port, and in fact you want to b) hide the port for security reasons, you use a reverse proxy like nginx.
You configure nginx to map the url to the port. During dev, assuming you're using your local machine, the configuration would be something like:
http://localhost/pretty-happy-app ---> localhost:1943
Once you go public, nginx would map like this:
http://your-domain.com/pretty-happy-app ---> your-public-machine:1943
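The corresponding nginx config is a short location block, roughly like this (a sketch; exact details depend on your setup):

    server {
        listen 80;
        server_name your-domain.com;

        location /pretty-happy-app/ {
            # forward to the app PM2 keeps alive on port 1943
            proxy_pass http://127.0.0.1:1943/;
        }
    }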
I hope this helps someone in the future, not like the too-cool-for-school downvote fairies.

Deploying updates to production node.js code

This may be a basic question, but how do I go about efficiently deploying updates to currently running node.js code?
I'm coming from a PHP, JavaScript (client-side) background, where I can just overwrite files when they need updating and the changes are instantly available on the production site.
But in node.js I have to overwrite the existing files, then shut down and re-launch the application. Should I be worried about potential downtime here? To me it seems like a riskier approach than the PHP (scripting) way. Unless I have a server cluster, where I can take down one server at a time for updates.
What kind of strategies are available for this?
In my case it's pretty much:
svn up; monit restart node
This Node server is acting as a comet server with long polling clients, so clients just reconnect like they normally would. The first thing the Node server does is grab the current state info from the database, so everything is running smoothly in no time.
I don't think this is really any riskier than doing an svn up to update a bunch of PHP files. If anything it's a little bit safer. When you're updating a big PHP project, there's a chance (on a high-traffic site, basically a 100% chance) that the web server will receive requests while you're still updating. This means you'd be running updated and out-of-date code in the same request. At least with the Node approach, you can update everything, restart the Node server, and know that all your code is up to date.
I wouldn't worry too much about downtime; you should be able to keep it so short that chances are no one will notice (kill the process and re-launch it in a bash script or something if you want to keep it to a fraction of a second).
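For example, a tiny restart script along these lines keeps the gap to a fraction of a second (file names are made up):

    #!/bin/bash
    # quick-and-dirty restart: kill the old process, launch the new code
    kill $(cat node.pid) 2>/dev/null
    node server.js &
    echo $! > node.pid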
Of more concern however is that many Node applications keep a lot of state information in memory which you're going to lose when you restart it. For example if you were running a chat application it might not remember who users were talking to or what channels/rooms they were in. Dealing with this is more of a design issue though, and very application specific.
If your node.js application 'can't skip a beat', meaning it is under continuous bombardment of incoming requests, you simply can't afford the downtime of even a quick restart (even with nodemon). In some cases you simply want a seamless restart of your node.js apps.
To do this I use naught: https://github.com/superjoe30/naught
Zero downtime deployment for your Node.js server using builtin cluster API
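The underlying idea, which you can also sketch with the cluster API directly, is to restart workers one at a time so the listening socket never goes down (a simplified illustration, not naught's actual code; './server' is a placeholder for your app):

    // start one worker; on SIGUSR2, spin up a replacement running the new
    // code, and retire the old worker only once the new one is listening
    const cluster = require('cluster');
    if (cluster.isMaster) {
      cluster.fork();
      process.on('SIGUSR2', () => {
        const oldWorker = Object.values(cluster.workers)[0];
        const newWorker = cluster.fork();
        newWorker.on('listening', () => oldWorker.disconnect());
      });
    } else {
      require('./server'); // your actual app, which calls server.listen()
    }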
Some of the cloud hosting providers for Node.js (like Nodejitsu or Windows Azure) keep both versions of your site on disk in separate directories and just redirect the traffic from the old version to the new one once the new version has been fully deployed.
This is usually a built-in feature of Platform as a Service (PaaS) providers. However, if you are managing your own servers, you'll need to build something that lets traffic move from one version to the next once the new one has been fully deployed.
An advantage of this approach is that rollbacks are easy, since the previous version remains intact on disk.
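If you do manage your own servers, a symlink flip gets you most of the way there (a sketch; paths are hypothetical):

    # deploy the new version alongside the old one, then switch atomically:
    ln -sfn /srv/app/releases/v2 /srv/app/current
    # then restart the app process that serves /srv/app/current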
