I have an Nx monorepo with a couple of apps, including a NestJS API and a PostgreSQL database. I am setting up an e2e target on my API to run a suite of end-to-end tests. To do so, I want my API app to be served before the tests execute. I am using the @nrwl/jest:jest executor to run the tests, and the target includes a dependsOn that runs an e2e-setup target. The e2e-setup target uses the @nrwl/workspace:run-commands executor to run npx nx run rest-api:serve. The problem is that the serve command blocks: the terminal stays attached to the API process. I know I can push it into the background with npx nx run rest-api:serve &, but then I would have to script some logic to wait until the API is listening.
There does not seem to be a documented way to serve an Nx app in the background. Is that right? I am thinking perhaps the best approach would be to containerize my apps and serve everything I need (API, DB, Redis, etc.) through docker-compose before executing my e2e tests. What would be the most idiomatic way to do this?
I am trying to run a Jenkins pipeline to run my test cases. My test cases use the Testcontainers framework. Jenkins is installed on Azure Kubernetes Service (AKS).
When I try to execute the pipeline, Azure does not allow modifying the Docker socket, so this mount does not work:
-v /var/run/docker.sock:/var/run/docker.sock
Reading the AKS guidelines, I found the following note under Container limitations:
You can no longer access the docker engine, /var/run/docker.sock, or use Docker-in-Docker (DinD).
https://learn.microsoft.com/en-us/azure/aks/cluster-configuration#containerd-limitationsdifferences
How can I map the Docker socket on AKS in order to run Docker images and Testcontainers? Is it possible to use Testcontainers on AKS, or would I need to change my cloud provider?
According to the Testcontainers documentation:
Testcontainers requires a Docker-API compatible container runtime.
This isn't available in Kubernetes, as you note, and it means you can't use Testcontainers in a Kubernetes-hosted CI system.
You might consider breaking up your test suite into a set of pure unit tests and separate integration tests. The unit tests don't depend on containers or any other external resources, possibly using stub implementations of interfaces or mocks of external API calls; they necessarily only simulate reality, but can run anywhere your build system runs. The integration tests would depend on a fully-deployed application and make API calls to your container's endpoints. This split would let you avoid needing to run external dependencies in the unit-test phase, and thereby avoid Testcontainers.
I am building a web application and would like to run 3 different commands.
Run MongoDB.
Run react-scripts to build/watch the React app.
Run the Express server.
I can do these individually in different terminal sessions, but would it be possible to run them with one command, using something such as pm2? (I know it's meant for production use cases, but is it overkill for this?) Or would something like a Docker container work?
Sorry if this is an awful question, but I am at a loss as to what I should actually be looking for!
Thanks,
I have an application with Node.js and R code. The latter runs in a Docker container.
I am planning some end-to-end tests, for which I need the Docker containers running. The service inside the container is stateful, so I would need to restart it for each test (for instance, in beforeEach).
I would like to know the common way of doing this. I was thinking of executing an external command from the Node.js code, something like exec('docker run ...'), but I don't know whether that is correct and elegant.
Any help is welcome.
The Docker daemon exposes a RESTful API that you might want to take a look at. The Docker Engine API is documented and versioned.
It might be much cleaner to interact with this API than to fork docker commands.
I am trying to start using NoFlo in my existing microservice architecture, and I want to start with an HTTP server so that I can mount it on my proxy and play/test with it.
You can find the repository here.
I am using Docker (Compose) to manage some services (with a Dockerfile and start-docker.sh), but they all also have local startup scripts (start-local.sh). Both scripts run NPM scripts to start the servers with their injected ENV vars.
I have some questions:
Should the starting point of the application be the server.js file, or a .fbp Graph?
What do I put in my package.json to start the server?
When I have started all the Docker containers with Docker Compose and the NoFlo server is running, will I be able to program an HTTP server using Flowhub.io?
Whether you run your process with a custom Node.js script (embedding NoFlo inside) or run NoFlo as the top-level control flow doesn't really matter that much.
For the former case, build and run your Docker image just like you would any other Node.js one.
For the latter case, you may want to execute the graph via noflo-nodejs. If you want to make the graph live programmable from the outside (with for instance Flowhub), you should also expose the FBP protocol port.
You can find a simple example of running a NoFlo graph via Docker here:
https://github.com/flowhub/bigiot-bridge/blob/master/Dockerfile
For easier switching between running in Docker and running locally, one great option is to make the noflo-nodejs command the start script in package.json.
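For instance, the start script might look roughly like this (the graph path and the exact flags are assumptions and may differ between noflo-nodejs versions):

```json
{
  "scripts": {
    "start": "noflo-nodejs --graph graphs/main.json --host 0.0.0.0 --port 3569"
  }
}
```

Both the Docker CMD and start-local.sh can then simply run `npm start` with their own injected ENV vars.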
I'm working through StrongLoop's getting-started instructions and created my sample app. Whilst the instructions tell me to use:
slc run .
to start my application, I noticed I can equally run my application with:
node app.js
and get the same result. Obviously, by using the second approach I can integrate my StrongLoop application with tools such as forever.
So my question is: what extra benefits does slc run offer? Does it have functionality such as auto-restart?
You can do more with slc than with node app.js.
slc is the StrongLoop command-line tool, which has more features. If you just want to run the app, it doesn't matter much, but if you want to do more, you can.
Here's the documentation: http://docs.strongloop.com/display/SLC/StrongLoop+Controller
It doesn't have many features for development (such as auto-restart), but it will help with managing servers and whatnot.
My favorite feature is scaling a node app using slc.
You can do "slc run . size 2". This will spin up 1 master and 1 worker process as part of a single cluster. Now if my workload increases and resources run low, which I know from StrongOps monitoring (slc strongops), and I want to scale the app without having to stop it and re-engineer, I can just do the following:
"slc clusterctl size 4". This will spin up 2 more worker processes and automatically attach them to the same application cluster at run time. The master will automatically distribute the workload to the new processes.
This is built on top of the Node cluster module, but there is much more to it. It also uses cluster-store to store shared cluster state objects.
Another feature is "slc debug". It launches Node Inspector, brings the application code into the runtime context, and helps me debug, load source maps, and iterate through test runs.
Based on the latest release at the moment (v2.1.1), the main immediate benefit of running slc run instead of node app.js is that you get a REPL at the same time (lib/run-reple.js#L150L24). It looks like all you have to do is have main set properly in package.json, since it uses Module._load().
If you run slc run app.js you get no benefit as far as I can tell: lib/commands/run.js#30.
Yay open source! https://github.com/strongloop/strong-cli
One of my favorite features is 'slc debug app.js', which brings up node-inspector for debugging. It's nice CLI sugar, but of course you can run node and configure this manually.
I created a Linux init.d daemon script which you can use to run your app with slc as a service:
https://gist.github.com/gurdotan/23311a236fc65dc212da
Might be useful to some of you.
slc run can only be used for StrongLoop applications, while node . or node [fileName] can be used to execute any Node.js file.