Hi, I currently have one container running my frontend application, which includes a server-side part written in Node.js and a client-side part written in React. To run the entire application I have to run 3 scripts:
CLIENT: one for building and watching the client-side code
SERVER: one for building and watching the server-side code
START: one to start the Node application
I've just created a Docker container to build and start the whole application, but I need a way to run these 3 watcher commands with separate log output. How can I achieve this?
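One possible approach (not necessarily what the asker ended up with): run all three commands inside the container under a process manager such as pm2, which keeps a separate log file per process. A minimal sketch, assuming the three commands are wired up as npm scripts named watch:client, watch:server and start (made-up names):

// ecosystem.config.js — a sketch; script names and log paths are assumptions
module.exports = {
  apps: [
    { name: 'client-watch', script: 'npm', args: 'run watch:client',
      out_file: 'logs/client.log', error_file: 'logs/client.err.log' },
    { name: 'server-watch', script: 'npm', args: 'run watch:server',
      out_file: 'logs/server.log', error_file: 'logs/server.err.log' },
    { name: 'app', script: 'npm', args: 'run start',
      out_file: 'logs/app.log', error_file: 'logs/app.err.log' },
  ],
};

The container's entrypoint could then be pm2-runtime ecosystem.config.js, and pm2 logs client-watch (etc.) shows each process's output on its own.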
I have a website hosted on Heroku and Firebase (a React frontend and a Node.js backend), and I have some "long running scripts" that I need to perform. I had the idea of deploying a Node process to my Raspberry Pi to execute these, because I need resources from inside my network.
How would I set this up securely?
I think I need to create a Node.js process that regularly checks the central server for jobs to be done. Can I use sockets for this? What technology would you use?
I think the design would be:
1. The local agent starts and connects to the server
2. The server pushes messages to the agent, or the local agent polls on a time interval (see the sketch after the edit below)
EDIT: I have multiple users that I would like to serve. A user should be able to "download" the agent and set it up so that it connects to the remote server.
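For the polling variant, a minimal sketch of what the local agent could look like (the endpoint URL, auth header and job shape are all made-up assumptions; Node 18+ for the global fetch):

// agent.js — polls a hypothetical /jobs endpoint every 30 seconds
const SERVER_URL = 'https://example.com/jobs'; // placeholder endpoint

async function poll() {
  try {
    const res = await fetch(SERVER_URL, {
      headers: { Authorization: 'Bearer <agent-token>' }, // placeholder auth
    });
    const jobs = await res.json(); // assume the server returns an array of jobs
    for (const job of jobs) {
      console.log('running job', job.id);
      // ...run the long-running task here...
    }
  } catch (err) {
    console.error('poll failed', err);
  }
}

setInterval(poll, 30 * 1000);
poll();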
You could just use Firebase for this, right? Create a new Firebase database for "tasks" (or whatever) that is accessible only to you. When the central server (whatever that is) determines there's a job to be done, it adds it to your tasks database.
Then you write a simple Node app that you run on your Raspberry Pi: it starts up, authenticates with Firebase, and listens for updates on your tasks database. When a task is added, it runs your long-running job, then removes that task from the database.
Wrap it up in a bash script that automatically restarts it if it crashes, and you've got a super simple pub/sub setup without needing to expose anything on your local network.
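A hedged sketch of that Raspberry Pi agent, using the firebase-admin SDK and a Realtime Database path called tasks (the path, credentials file, database URL and runTask function are assumptions):

// agent.js — listens for new tasks and removes them when done
const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccountKey.json'); // downloaded from the Firebase console

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://<your-project>.firebaseio.com', // placeholder
});

const tasksRef = admin.database().ref('tasks');

tasksRef.on('child_added', async snapshot => {
  const task = snapshot.val();
  await runTask(task);         // your long-running job goes here
  await snapshot.ref.remove(); // delete the task once it's done
});

async function runTask(task) {
  console.log('running task', task);
}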
I have a Node app that exposes a REST API. When it receives an HTTP request, it starts another/different Node app; let's call it the 'service app'.
The REST app runs inside a container, and the easiest way to start the service app is to just call child_process.exec (we just use pm2, though), but then they run inside the same container. If the REST app gets multiple requests, this single-container solution just won't scale.
So is it possible for the REST app to start the service app running inside its own container? If yes, how?
Someone also suggested running my REST app in Docker Swarm, so that when it gets a request it just starts another Docker service for the service app. But I have no idea how to do that, or even whether it is possible.
I am new to Docker; any advice is highly appreciated. Thanks!
You can control Docker from inside a container by, for example, bind mounting the /var/run/docker.sock file into the container itself (the -v flag to docker run). But be very careful: if someone gains access to it, that is more or less equal to giving them root access to the machine. The safest way would be to create a second REST app that runs in a separate container and can start new containers when asked. Then you could just invoke it from the first app and be sure that it will start only containers with your app and nothing else.
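If you do go the docker.sock route, here is a minimal sketch of starting a container from Node using the dockerode client (the image name and command are placeholders, not the asker's actual app):

// requires the container to be started with:
//   docker run -v /var/run/docker.sock:/var/run/docker.sock ...
const Docker = require('dockerode');
const docker = new Docker({ socketPath: '/var/run/docker.sock' });

async function startServiceApp() {
  const container = await docker.createContainer({
    Image: 'my-service-app:latest', // placeholder image name
    Cmd: ['node', 'service.js'],    // placeholder command
  });
  await container.start();
  return container.id;
}

startServiceApp().then(id => console.log('started container', id));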
I'm working on something, using Node.js, that's intended to run as a service that I can connect to.
Let's say I'm working on a Calculator npm module.
I will need to run it within my repo as follows:
./node_modules/.bin/calculator start
And I want to keep it running forever, so I can connect to it somehow (on a port maybe?).
Then I can send/receive messages to/from the calculator using another node module, let's say 'calculator-connector', for example as follows:
var calcConnector = require('calculator-connector'),
calc = calcConnector.connect();
calc.add(1, 2);
Any idea how I can achieve this design?
I'd make it like this:
The calculator by itself shouldn't be opinionated about where and when it will run. I'd just create it at the moment I need it:
var calculator = require('calculator');
calculator.listen('localhost', 8000); // create the service listening on port 8000

// create a client capable of submitting tasks
var calcClient = createCalculatorClient('localhost', 8000);
calcClient.add(1, 2);
I believe that such a setup is optimal for quick development and debugging.
When you need things to be really separated (say, the calculator itself will run on a separate server), you can write a simple Node script that runs the calculator (basically the first two lines of the snippet above) and then create a simple upstart job (on a Debian-ish server, or some alternative on other platforms) that keeps the script alive.
PS:
Check out how Express works, it's beautifully designed: http://expressjs.com/
Read more about upstart: http://upstart.ubuntu.com/getting-started.html
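For reference, such an upstart job might look roughly like this (a sketch; the job name and paths are assumptions):

# /etc/init/calculator.conf
description "calculator service"
start on runlevel [2345]
stop on runlevel [016]
respawn
exec /usr/bin/node /opt/calculator/server.js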
To run the calculator forever, you should use PM2 or Forever.
PM2 allows you to keep applications alive forever, to reload them without downtime, and to facilitate common system admin tasks.
For the connection, you could create an HTTP or TCP server.
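As a hedged illustration of that last point, here is a minimal sketch of an HTTP-based calculator service plus a tiny client, using only Node's built-in http module (the route and payload shape are made up):

const http = require('http');

// Service: POST { "a": 1, "b": 2 } and get { "result": 3 } back.
// A real service would also check req.url and req.method.
const server = http.createServer((req, res) => {
  let body = '';
  req.on('data', chunk => { body += chunk; });
  req.on('end', () => {
    const { a, b } = JSON.parse(body || '{}');
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify({ result: a + b }));
  });
});
server.listen(8000, 'localhost');

// Client: the 'calculator-connector' role from the question.
function add(a, b) {
  return new Promise((resolve, reject) => {
    const req = http.request(
      { host: 'localhost', port: 8000, path: '/add', method: 'POST' },
      res => {
        let data = '';
        res.on('data', chunk => { data += chunk; });
        res.on('end', () => resolve(JSON.parse(data).result));
      }
    );
    req.on('error', reject);
    req.end(JSON.stringify({ a, b }));
  });
}

add(1, 2).then(result => console.log(result)); // 3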
I've written a web scraper using Node.js which scrapes a particular website, gets some data from it and then tweets it. I've deployed this app on Heroku and it crashes with an error saying that $PORT couldn't be bound within 60 seconds. I've tried putting both worker: node index.js and web: node index.js in the Procfile, one at a time, but it still crashes.
Since my app is not a server and hence does not need a port number, how can I keep it running without making my app a server?
Turn on the worker process and turn off the default web process:
$ heroku scale web=0 worker=1
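In the Procfile itself, one common setup is to keep only the worker entry, since only web dynos are expected to bind $PORT (assuming index.js is the entry point, as in the question):

# Procfile
worker: node index.js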
Take a look at the following link: http://pm2.keymetrics.io/docs/usage/use-pm2-with-cloud-providers/
I am working with Node.js and Heroku and would like to have more than one web process running.
In the Procfile I have it looking like this:
web: node web.js
web: node differentWeb.js
However, when I run it, only the last of the two runs. Is there a way to do this?
An application on Heroku can only have one process type named web, and that is the only one that gets routed HTTP requests. If you need another web process, that would have to be a separate application on Heroku.